Chrome freezes when playing videos from Rackspace Cloud Files

Can't seem to get Chrome to play videos with the HTML5 video tag when I host them on Rackspace Cloud Files.
Works perfectly on the regular hosting, but as soon as I link the video with the Rackspace CDN URL, Chrome freezes (a total freeze, with the website UI completely blocked; after a while Chrome pops up a message saying "The following page has become unresponsive...").
The video file itself is fine; it's the same file I link to on the regular hosting.
Did a bit of spying on the requests, and I initially thought the problem was that the WebM files were served by default with the application/octet-stream MIME type. I lodged a ticket with Rackspace and they gave me a way to force the MIME type when uploading the file. Did that, and the file is now correctly sent as video/webm... but Chrome still freezes.
Any idea what could be going wrong here?
EDIT: Using iheartvideo, loading the video from Rackspace triggers a MEDIA_ERR_SRC_NOT_SUPPORTED error. The same video served off a local web server works totally fine (??)
EDIT 2: Happens on both Mac and Windows with the latest stable Chrome.
EDIT 3: curl -I results:
Rackspace (no worky):
HTTP/1.1 200 OK
Server: nginx/0.7.65
Content-Type: video/webm
Last-Modified: Thu, 24 Feb 2011 23:45:12 GMT
ETag: 7029f83b241aa691859012dfa047e20d
Content-Length: 20173074
Cache-Control: public, max-age=900
Expires: Fri, 25 Feb 2011 01:32:11 GMT
Date: Fri, 25 Feb 2011 01:17:11 GMT
Connection: keep-alive
Web server (worky):
HTTP/1.1 200 OK
Date: Fri, 25 Feb 2011 01:17:51 GMT
Server: Apache
Last-Modified: Thu, 24 Feb 2011 03:56:26 GMT
ETag: "11a0b47-133d112-49cff32940e80"
Accept-Ranges: bytes
Content-Length: 20173074
Content-Type: text/plain
EDIT 4: For those interested, this is what the Rackspace crew told me to do to set a WebM content type on a file:
The file browser is not smart enough to determine the content type video/webm. Unfortunately, there is not a way to change the content type of a file that has already been uploaded. You'll need to use one of the APIs to re-upload your files with the correct content type.
You can also use curl from a Linux/macOS command line if available. Using your username and API key, run this command...
curl -I -X GET -H "X-Auth-User: USERNAME" -H "X-Auth-Key: API_KEY" https://auth.api.rackspacecloud.com/v1.0
From the output, there are two important values:
X-Storage-Url: https://storage101.......
X-Storage-Token: Long hash
You can upload the files with:
curl -X PUT -T test.webm -H "Content-Type: video/webm" -H "Content-Length: FILESIZEINBYTE" -H "X-Auth-Token: TOKEN FROM RESPONSE ABOVE" https://STORAGE URL FROM RESPONSE ABOVE/test.webm
You must specify the content type, and you must give the correct length in bytes of what is being uploaded. If not, you will get an invalid request error.
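For what it's worth, an easy way to fill in FILESIZEINBYTE is to ask the filesystem for the file size; the flag differs between the GNU and BSD versions of stat:
stat -c %s test.webm   # Linux (GNU coreutils)
stat -f %z test.webm   # macOS / BSD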

I work quite a bit with the Rackspace API.
Their API actually allows you to set a container as streaming enabled. My first instinct tells me that you haven't done this.
I stream various file types and they all work an absolute treat.
There is more information about CDN Streaming Enabled containers here:
http://docs.rackspace.com/files/api/v1/cf-devguide/content/Streaming-CDN-Enabled_Containers-d1f3721.html
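For a rough idea of what this looks like at the API level, here is a curl sketch; the CDN management URL and container name are placeholders, and the exact headers are described in the document linked above:
# CDN-enable the container (streaming delivery is exposed through a separate URI)
curl -X PUT -H "X-Auth-Token: YOUR_TOKEN" -H "X-Cdn-Enabled: True" https://CDN_MANAGEMENT_URL/YOUR_CONTAINER
# HEAD the container and look for the X-Cdn-Streaming-Uri header in the response;
# use that URI, rather than the plain CDN URI, as the base for your video src
curl -I -H "X-Auth-Token: YOUR_TOKEN" https://CDN_MANAGEMENT_URL/YOUR_CONTAINER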
Hopefully this helps, but if not let me know and I don't mind putting some PHP example code together to help you along.
It's all quite easy, but getting your head around the various API operations that Rackspace has implemented can sometimes be a daunting task.

I don't have a concrete answer, just some thoughts:
what about in other browsers?
it works on the web server even though the content type there is text/plain, so why force video/webm?
can Rackspace provide you (or can you find on their site or someone else's) some sample content that does play so you can inspect it?
You could try the free trial from Brightcove or Bitgravity and see if that works...

Related

Save and serve bundle.js as a gzip version only

I have a React app and save the bundle.js on a CDN (or S3 for this example).
On save I run gzip -9.
On upload to the CDN / S3 I add the header Content-Encoding: gzip.
Now each time a browser / HTTP client downloads the bundle it will get:
curl -I https://cdn.example.com/bundle.min.js
HTTP/2 200
content-type: application/javascript
content-length: 3304735
date: Wed, 27 Feb 2019 22:27:19 GMT
last-modified: Wed, 27 Feb 2019 22:26:53 GMT
content-encoding: gzip
accept-ranges: bytes
This works fine when I test it in a browser. My only concern is that now we only save a gzipped version of the JS bundle, and users will get it regardless of whether they send Accept-Encoding: gzip in the request.
I can't think of any issues this will cause for browsers, but I might be missing something.
Is it bad practice to "enforce" gzip in the response for the bundle.js file?
It's several months late, but this answer may still be relevant to someone trying the same thing.
Pre-compression with high compression settings can help save a few more percent of bandwidth on static resources. However, forcing a single encoding may create problems for some users. As per a study conducted in 2010, ~15% of users with gzip-capable browsers were not sending an appropriate Accept-Encoding request header, the reason being anti-virus software or intermediaries/proxies stripping the Accept-Encoding header to force the server to send the content in plain text.
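If you want to keep the bandwidth savings without forcing gzip on everyone, one option is to store both a plain and a pre-compressed copy and let your CDN layer route between them based on the request's Accept-Encoding header. A sketch with the AWS CLI (bucket name and paths are made up):
# Plain copy for clients that don't advertise gzip support
aws s3 cp bundle.min.js s3://my-bucket/bundle.min.js --content-type application/javascript
# Pre-compressed copy, marked with Content-Encoding so browsers decode it transparently
gzip -9 -c bundle.min.js > bundle.min.js.gz
aws s3 cp bundle.min.js.gz s3://my-bucket/bundle.min.js.gz --content-type application/javascript --content-encoding gzip
Note that S3 alone won't negotiate between the two objects; the routing has to happen in the CDN or via a redirect rule.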
If you need a CDN that covers this gap, PageCDN serves exactly the same purpose but offers superior brotli-11 compression and falls back to gzip for browsers that do not support Brotli. You do not need to connect to S3 or any external storage. Just connect it to your website or GitHub, or manually upload files to the CDN, and configure the compression level for your files.

How to debug mod_perl behind a mod_proxy reverse proxy?

I'm new to this, so bear with me. I want to debug why a page from the reverse-proxied site stops loading prematurely when it requests something like a filename.json file. I noticed the request header has Content-Type: application/x-www-form-urlencoded, and the response headers are
$WSEP:
Cache-Control:no-cache
Connection:Keep-Alive
Content-Encoding:gzip
Content-Language:en-US
Content-Length:55
Content-Type:text/html;charset=UTF-8
Date:Sun, 12 Jun 2016 04:54:18 GMT
Expires:Thu, 01 Jan 1970 00:00:00 GMT
Keep-Alive:timeout=5, max=97
Server:Apache
Vary:Accept-Encoding
X-Cnection:Close
I'm wondering why the Content-Type is text/html when the file is JSON, so I tried manipulating the response to use application/json, and I even tried serving the file locally from my server's /var/www/html (by creating a dummy JSON file) instead of downloading it from the remote server. I was thinking that because it's a JSON file, the Content-Type should be application/json, not text/html. But the issue still exists: loading just stops when it reaches the request for that JSON file.
Now my question is: how can I make sure that the request for that JSON file is the culprit? Or, more to the point, how can I debug mod_perl? I tried adding
use warnings;
in the code and
PerlWarn On
in the Apache config, but I still don't see any warnings or errors to guide my debugging.
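For reference, the relevant config bits look roughly like this (the log path is just what my distribution uses; yours may differ):
# httpd.conf: surface mod_perl warnings and raise log verbosity so they land in the error log
PerlWarn On
LogLevel debug
ErrorLog /var/log/httpd/error_log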
Can you give me advice on how to find the fault?
By the way, the remote site returns a 404 for that JSON file, but I'd assume the proxy server should ignore that and continue downloading the other files.

Chromium CSS Link from CDN ignored

I'm having a sudden problem when loading my site, http://busme.us, with a newer version of Chromium.
It appears that one of my stylesheets is not being applied to the elements, as far as I can tell from Inspect Element.
The link in question is the following:
<link href="http://objects.dreamhost.com/busme-prod-files/main/538e298e22f324388b007ab8/original/style.css" rel="stylesheet">
However, the main CSS file seems to be processed and is applying styles to elements; that one is:
<link href="//busme-prod-assets.objects.dreamhost.com/assets/normal-layout-3df2197619a7f407056e8569a5b5819f.css" media="all" rel="stylesheet" type="text/css">
It all works on Firefox and Safari. It does NOT work on the following versions of Chromium.
Version 45.0.2454.85 Ubuntu 14.04 (64-bit)
Version 44.0.2403.89 Ubuntu 14.04 (64-bit)
These were upgraded from previous Chromium browsers that did work.
So, something changed in the upgrade of Chromium? I don't have older versions to test anymore since I upgraded.
If I download the contents of style.css and, using Developer Tools, insert a <style> element containing those contents, it works. So I believe it is not a syntax issue particular to Chrome.
Furthermore, I have seen other answers suggesting the CDN is returning a wrong content type such as binary/octet-stream, but the content type is the same for both of the CSS files in question, so why does one work and not the other? Anyway, I cannot really change the response type since I don't control the CDN.
Before you start pointing out the type="text/css" and media="all" and http: differences in the links: I've changed those as well. Inspect Element says it has downloaded the style.css file and states that its "Type" is 'stylesheet'; same for normal-layout.css.
I've even made the links look exactly the same by downloading the page into a file, editing it locally, and loading it from the file, with the same results. Try it!
So, I'm really at a loss of what is going on.
Why is it only Chrome? Where is the incompatibility, given the other browsers work?
Thanks.
As mentioned in a comment above, the problem is actually the Content-type header.
Specifically, the http://objects.dreamhost.com/busme-prod-files/main/538e298e22f324388b007ab8/original/style.css file is being served with a Content-type: undefined header. It should be Content-type: text/css.
The other file you mention, http://busme-prod-assets.objects.dreamhost.com/assets/normal-layout-3df2197619a7f407056e8569a5b5819f.css, is served with Content-type: text/css, and that’s why it works as expected.
Verifying Content-type on your own
To verify headers yourself, you can use the curl command with the -I option, like this:
curl -I http://objects.dreamhost.com/busme-prod-files/main/538e298e22f324388b007ab8/original/style.css
That produces this output:
HTTP/1.1 200 OK
Content-Length: 16795
Accept-Ranges: bytes
Last-Modified: Tue, 03 Jun 2014 20:01:18 GMT
ETag: "06b3748eba4f2e5ccb70e8444abd5c77"
Content-type: undefined
Date: Thu, 24 Sep 2015 21:33:18 GMT
Compare that to running this:
curl -I http://busme-prod-assets.objects.dreamhost.com/assets/normal-layout-3df2197619a7f407056e8569a5b5819f.css
That produces this:
HTTP/1.1 200 OK
Content-Length: 239447
Accept-Ranges: bytes
Last-Modified: Thu, 07 May 2015 05:44:24 GMT
ETag: "e4b17d7e6405120b1488079bb0081e6f"
Cache-Control: public, max-age=31557600
Expires: Fri, 06 May 2016 11:44:23 GMT
Content-type: text/css
Date: Thu, 24 Sep 2015 21:33:36 GMT
If you don’t have the curl command installed but do have the wget command, you can get similar information with wget -S, like this:
wget -S http://objects.dreamhost.com/busme-prod-files/main/538e298e22f324388b007ab8/original/style.css
Furthermore, I have seen other answers, such as the CDN is returning the wrong content type of binary/octet-stream, but this content type is the same for both of the CSS files in question, so why does one work and not the other?
As the curl -I output above indicates, the Content-type for the files you mention is not actually the same; one has the correct Content-type: text/css header, while the other doesn't.
If you’re able to serve the file from http://busme-prod-assets.objects.dreamhost.com with the correct Content-type: text/css header, then you should be able to ensure that the files served from http://objects.dreamhost.com/busme-prod-files get the right Content-type as well.
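For example, DreamObjects speaks the S3 API, so if you have s3cmd configured against it, something like the following can rewrite the stored content type in place (treat it as a sketch; the bucket and key are taken from the URLs above):
s3cmd modify --mime-type="text/css" s3://busme-prod-files/main/538e298e22f324388b007ab8/original/style.css
Under the hood, modify copies the object onto itself with the new metadata, so the file body does not need to be re-uploaded.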

Why isn't the browser loading the CDN file from cache?

Here is a very simple example to illustrate my question, using jQuery from a CDN to modify the page:
<html>
  <body>
    <p>Hello Dean!</p>
    <script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.3/jquery.min.js"></script>
    <script>$("p").html("Hello, Gabe!")</script>
  </body>
</html>
When you load this page with an internet connection, the page displays "Hello, Gabe!". When I then turn off the internet connection, the page displays "Hello Dean!" with an error: jQuery is not available.
My understanding is that CDNs send a long Cache-Control and Expires in the response headers, which I take to mean that the browser caches the file locally.
$ curl -s -D - https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.3/jquery.min.js | head
HTTP/1.1 200 OK
Server: cloudflare-nginx
Date: Fri, 17 Apr 2015 16:30:33 GMT
Content-Type: application/javascript
Transfer-Encoding: chunked
Connection: keep-alive
Last-Modified: Thu, 18 Dec 2014 17:00:38 GMT
Expires: Wed, 06 Apr 2016 16:30:33 GMT
Cache-Control: public, max-age=30672000
But this does not appear to be happening. Can someone please explain what is going on? Also: how can I get the browser to use the cached copy of jQuery?
This question came up because we want to use CDNs to serve external libraries, but also want to be able to develop the page offline, like on an airplane.
I get similar behavior in Chrome and Firefox.
This has nothing to do with the CDN as such. When the browser encounters a script tag, it requests the script from the server, whether it's hosted on a CDN or on your own server. If the browser has previously loaded it from the same address, the server tells it whether the file should be reloaded or not (by sending a 304 HTTP status code).
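You can watch that revalidation happen with curl by replaying the Last-Modified date from the response above; whether you actually get a 304 back depends on the server:
curl -s -o /dev/null -w '%{http_code}\n' -H 'If-Modified-Since: Thu, 18 Dec 2014 17:00:38 GMT' https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.3/jquery.min.js
# prints 304 if the server agrees your cached copy is still good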
What you are probably looking for is a way to cache your application for offline use. That is possible with an HTML5 cache manifest file: you create a file listing everything that needs to be cached for explicit offline use.
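A minimal sketch of such a manifest; the filename is arbitrary, but it must be served with the text/cache-manifest MIME type and referenced from the page as <html manifest="offline.appcache">:
CACHE MANIFEST
# v1 - list every resource the page needs offline
index.html
https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.3/jquery.min.js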
Since the previous answer recommended using a cache manifest file, I just wanted to make people aware that this feature is being dropped from the web standards.
Info available from Mozilla:
https://developer.mozilla.org/en-US/docs/Web/HTML/Using_the_application_cache
It's recommended to use service workers instead of the cache manifest.
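As a minimal sketch of the service worker equivalent (file and cache names are placeholders; register it from the page with navigator.serviceWorker.register('/sw.js')):
// sw.js: cache the page and the CDN copy of jQuery at install time,
// then answer fetches cache-first, falling back to the network
self.addEventListener('install', function (event) {
  event.waitUntil(
    caches.open('offline-v1').then(function (cache) {
      return cache.addAll([
        '/',
        'https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.3/jquery.min.js'
      ]);
    })
  );
});
self.addEventListener('fetch', function (event) {
  event.respondWith(
    caches.match(event.request).then(function (cached) {
      return cached || fetch(event.request);
    })
  );
});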

iOS 7 memory issue loading JSON file via AngularJS (Ionic) httpPromise.then

I have an odd issue. I am using AngularJS (Ionic) to load an external JSON file via an httpPromise.
All had been working correctly until yesterday, when the remote files were moved to another host.
Now the problem is that on an iPhone 4 running iOS 7, it tries to load the file but can't, and crashes out with memory usage issues. Inspecting it in Xcode, memory quickly climbs to over 300 MB and then the app crashes. It does the same thing on two devices, and runs fine on other phones, in emulators, etc.
Now if I host the file on another server it works as expected.
The differing response headers are:
The one that fails:
Accept-Ranges bytes
Connection close
Content-Length 721255
Content-Type application/json
Date Thu, 11 Dec 2014 06:04:15 GMT
Last-Modified Thu, 11 Dec 2014 05:12:57 GMT
Server LiteSpeed
Vary User-Agent
Working host:
Accept-Ranges bytes
Connection keep-alive
Content-Encoding gzip
Content-Type application/json
Date Thu, 11 Dec 2014 06:05:01 GMT
Last-Modified Thu, 11 Dec 2014 03:29:48 GMT
Server nginx/1.6.2
Transfer-Encoding chunked
Vary Accept-Encoding,User-Agent
Code used to get the JSON file:
var deferred = $q.defer(),
    httpPromise = $http.get('http://someurl.com.au/deals.json');
httpPromise.then(function success(response) {
    deferred.resolve(response.data); // resolve with the parsed JSON; never reached on the failing host
});
So, after all of that, my question is: why would the JSON file not load, or at least return an error of some sort, but instead cause such a memory spike?
The only major difference I see between the servers is the connection configuration.
The working one uses Connection: keep-alive, and the one that fails uses Connection: close.
Thoughts?
Additionally, I've just tried it on 3G and it works fine, but over Wi-Fi it doesn't. Why?
I think you need to focus on the Content-Encoding: gzip header, which is probably what is saving you from memory issues on the working host.
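To check that, you can offer gzip explicitly with curl and compare what each host sends back (URL taken from the question):
curl -s -I -H 'Accept-Encoding: gzip' http://someurl.com.au/deals.json | grep -iE 'content-(encoding|length)|connection'
If the failing host answers with no Content-Encoding header and the full Content-Length, it is sending the roughly 721 KB file uncompressed.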