How to accurately measure HTML5 Browser Framerates (FPS)?

What is the most accurate way to measure framerates, i.e. FPS, in modern HTML5 browsers? I'm specifically interested in FPS for Canvas animations.
http://weblogs.mozillazine.org/roc/archives/2010/11/measuring_fps.html will tell you that trying to measure framerate by counting how often your setTimeout runs is not accurate. The browser can run your Timeout callback multiple times between screen paints.
It turns out Mozilla has window.mozPaintCount (https://developer.mozilla.org/en/DOM/window.mozPaintCount) available, which should allow computing an accurate FPS. However, this only works in Firefox.
There's an open issue for Chrome for something similar: http://code.google.com/p/chromium/issues/detail?id=65348
A manual way to check the hardware-accelerated FPS in Chrome is to grab the Chrome Beta channel (as of posting date), go to about:flags and turn on the FPS counter. However, on a Mac, acceleration only kicks in when using WebGL, so there's no way to check FPS for Canvas in Chrome on a Mac.
What are other strategies for accurately measuring HTML5 FPS?
Thanks!

Please check:
https://github.com/mrdoob/stats.js - it's the best FPS monitor I know. It also gives you some stats about memory/CPU usage (you have to run your browser with a special parameter to expose that data), but it may suffer from the same inaccuracy you described.
https://github.com/pcwalton/firefox-framerate-monitor
Also, in new Chrome builds (probably the Canary channel) there should be an option for displaying FPS in about:flags.
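One approach worth trying yourself (a rough sketch, not a drop-in library): count requestAnimationFrame callbacks over one-second windows. requestAnimationFrame fires at most once per repaint, so it tracks actual paints more closely than a setTimeout counter; in older browsers you may need the vendor-prefixed mozRequestAnimationFrame / webkitRequestAnimationFrame variants. The names below (fps, sampleLoop) are just illustrative:

let frames = 0;
let lastSample = performance.now();
let fps = 0;

function sampleLoop(now) {
  frames++;                                    // one callback ≈ one painted frame
  if (now - lastSample >= 1000) {              // sample roughly once per second
    fps = Math.round(frames * 1000 / (now - lastSample));
    frames = 0;
    lastSample = now;
    console.log('~' + fps + ' fps');
  }
  requestAnimationFrame(sampleLoop);
}
requestAnimationFrame(sampleLoop);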

Related

HTML5 video buffering despite waiting for `canplaythrough` event to `play()`

I'm trying to ensure (to the extent possible) that an HTML5 video begins playing only when it is able to play through completely without buffering. For context, the MediaStream of the video is then mixed with another audio source and sent over peer WebRTC connections. The videos are typically 5-10MB and a few minutes long (i.e. a decent broadband connection should have no trouble loading the entire video well before it's done playing).
To achieve this, my code currently waits for the canplaythrough event on the video element to begin and calls play() when it fires.
This "works" in the sense that the video begins playing and, in most cases, buffering is sufficient for the video to play through uninterrupted. But in a few cases (specifically, for two people so far, both running Chrome on MacBook Airs with decent, if not exceptional, broadband connections) the video plays staggered and choppy, which I believe means it has not sufficiently buffered.
Are there better techniques for ensuring that the video is sufficiently buffered on most browsers?
Would using fetch() to buffer the entire video in memory do the trick (see the sketch below)? Or is the resulting blob() still lazily buffered behind the scenes?
Are there good practices for testing and debugging these sorts of issues given that I can't really replicate this locally?
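For reference, the fetch()-to-blob idea I'm asking about would look roughly like this (a sketch only; the element id, file URL and error handling are illustrative, and whether the blob: URL really sidesteps all mid-playback buffering is exactly what I don't know):

const video = document.getElementById('myVideo');     // illustrative id

fetch('clip.mp4')                                      // illustrative URL
  .then(function (res) { return res.blob(); })         // resolves after the full body has downloaded
  .then(function (blob) {
    video.src = URL.createObjectURL(blob);             // play from the in-memory copy
    return video.play();
  })
  .catch(function (err) { console.error('preload/play failed', err); });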
I have built a tool called Stream or Not that might help on the network side. It will tell you how long the video takes to start, how many stalls occur, etc. You can use your browser's DevTools to throttle the network (and in Chrome, you can also throttle the CPU).
Honestly, to see whether the network is the issue: as long as the bitrate of the video (you can check it with FFprobe, e.g. https://www.streamclarity.com/probe?url=) is lower than the network speed, you are not network constrained.
There is another possibility. What are the dimensions of your video, and what are the dimensions of the viewport in the browser? If you are asking the device to scale down a lot of pixels, the bottleneck can shift from bandwidth to CPU processing speed. I have seen this happen on mobile devices and on older Macs trying to play 4K videos: there just isn't enough CPU to process that many pixels.
I'd test the network speeds, just to be sure.
Make sure you are not sending more pixels than you need. Underpowered devices will have issues.
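If you want to measure this yourself while reproducing the problem under DevTools throttling, a rough diagnostic along these lines (standard HTML5 media events; the variable names and logging are just illustrative) will show the startup delay and how often playback stalls to rebuffer:

const v = document.querySelector('video');
let stalls = 0;
const t0 = performance.now();

v.addEventListener('playing', function () {
  console.log('started after', Math.round(performance.now() - t0), 'ms');
}, { once: true });

v.addEventListener('waiting', function () {            // fires when playback halts to rebuffer
  stalls++;
  console.log('rebuffer #' + stalls + ' at ' + v.currentTime.toFixed(1) + 's');
});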

What does "frames with intermittent jank" mean in the Chrome DevTools docs?

I got confused when I read the Chrome DevTools | Timeline docs:
Keep in mind that just tracking the FPS counter may lead to you not noticing frames with intermittent jank. Be careful when using the content. It is also worth noting that FPS on desktop does not equal FPS on devices and special care should be taken to profile the performance there too.
Does "frames with intermittent jank" mean frames that fail to use a real hardware frame?
Apparently "jank" is a neologism:
"Jank is any stuttering, juddering or just plain halting that users see when a site or app isn't keeping up with the refresh rate. Jank is the result of frames taking too long for a browser to make, and it negatively impacts your users and how they experience your site or app."
From http://jankfree.org/
So "frames with intermittent jank" means ... well ... intermittent jank.
Does "frames with intermittent jank" mean frames that fail to use a real hardware frame?
Nope. You can't make that inference.
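A way to see the difference in practice (a sketch; the 33 ms threshold is an arbitrary illustrative choice, roughly two missed 60 Hz frames): log individual long frames instead of only an averaged FPS number. A steady-looking average can hide the occasional very long frame, which is exactly the intermittent jank the docs warn about.

let prev = performance.now();

function watchFrames(now) {
  const delta = now - prev;                   // time since the previous frame
  if (delta > 33) {                           // flag frames that took too long
    console.warn('janky frame: ' + Math.round(delta) + ' ms');
  }
  prev = now;
  requestAnimationFrame(watchFrames);
}
requestAnimationFrame(watchFrames);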

Flash Player behaviour when lost focus

I've got a real problem with Flash Player. What I need is to have it working at full speed when it is in THROTTLE mode, i.e. when Flash Player loses focus and decreases the framerate to about 4fps. This is commonly known as a feature for mobile phones, or for when you change tabs in your browser and your .swf movie no longer runs at full speed.
I need this full speed because we run tests with Flash SWFs on virtual servers, and unfortunately the tests take very long.
I found that ThrottleEvent was introduced in FP 11.2, which tells you what Flash Player is doing. It can go into the PAUSE, THROTTLE, or RESUME state. Unfortunately it seems that I can't force a different stage.frameRate once it actually goes into any of these states. I also tried Event.DEACTIVATE and Event.ACTIVATE without any results.
Can I work around this in any way? Or if not, what was the latest version of Flash Player before Adobe incorporated this feature?
Thanks for any response!
Kindest Pawel
You should try:
stage.addEventListener(ThrottleEvent.THROTTLE, doStuff);

Best HTML5 Video Format for Safari on Windows (or getting VP8 to play in Safari on Windows)

Here's the deal, through a huge series of events, I am stuck using Safari on Windows for video playback in HTML5.
I can't use any other browser, Chrome is out of the question, I must use Safari and it has to be on Windows for hardware compatibility.
The best format I've found is an h.264 QuickTime file, but I'm still getting some dropped frames and a bit of tearing.
The video is being played in 1920x1080 resolution and I have tried down-sampling to 720p, which causes noticeable quality loss and no noticeable gain in performance.
I'm looking for one of the following two as a solution:
- A plugin for Safari (that's Windows compatible) to use something other than Quicktime for HTML5 video. I've looked and the WebM (VP8) plugin is only for OSX.
- Any video format configuration that will decode faster in Quicktime on Windows. I've even tried ProRes to no avail, it's even slower than h.264.
Update...
Ogg Theora can be played in QuickTime with XiphQT, but I've run into many issues when trying to play back various Ogg video formats.
With h.264, if you are using x264 (e.g. via HandBrake) to transcode/encode the video, the following can be set in advanced mode:
cabac=0:ref=1:me=umh:bframes=0:weightp=0:8x8dct=0:trellis=0:subq=6:tune=fastdecode
These parameters:
ref=1 sets the reference frame limit to 1; using more reference frames requires more processing.
bframes=0 disables B-frames; I'm not sure about this, but I believe it forces P-frames, which are faster to decode.
cabac=0 disables CABAC entropy coding, which would make the output smaller but take more processing to decode.
tune=fastdecode sets the tune preset to optimize the output specifically for fast decoding.
The other options I am less sure of and have yet to find solid evidence on their impact on decoding, let alone whether they have any impact at all. For example, the "me" setting selects the motion-estimation method used while encoding; it affects video quality, and given how frames reference each other it could have an impact (in some videos) on the decoding process. That is something I do not know, but I am stating it for a better understanding of where I am coming from.
More about these settings can be found here:
http://mewiki.project357.com/wiki/X264_Settings
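If it helps, you can also ask Safari itself which of these containers/codecs it thinks it can decode before committing to one. canPlayType only answers "", "maybe" or "probably", but it at least rules formats out quickly (the codec strings below are common examples, not an exhaustive or authoritative list):

const probe = document.createElement('video');
console.log('mp4/h264  :', probe.canPlayType('video/mp4; codecs="avc1.42E01E, mp4a.40.2"'));
console.log('webm/vp8  :', probe.canPlayType('video/webm; codecs="vp8, vorbis"'));
console.log('ogg/theora:', probe.canPlayType('video/ogg; codecs="theora, vorbis"'));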

Do HTML5 audio tags eat up resources?

I have pages where I need to play dozens of small audio files when the user clicks on things. Responsiveness is very important.
I'm thinking of using one audio element for each file, and preloading the audio files. Is this a reasonable approach?
Thanks.
What I've experienced using SoundManager2 (an audio JavaScript lib) is that neither Chrome nor Firefox has issues loading and playing multiple (100+) sounds through their HTML5 capabilities
(Firefox must play OGG, though).
With IE9 it's a different story. It looks like it has a limit of no more than 40 sounds loaded and played. :-(
As the game we are developing constantly requires 50+ sounds to be played within a one-minute period, we have to fall back to Flash for playing sounds on IE9. Luckily SM2 does that too.
I can also confirm this behaviour in HTML5 mode using jPlayer. I'm only able to create 40 instances of jPlayer, each of which can preload and play the sound it defines.
The 41st and following instances fail with an error on IE9/Windows 7:
Error: "Media URL could not be loaded"
It's reasonable, and probably the correct solution. I recently wrote a demo application (http://www.soundscribe.com) that makes heavy use of individual (and simultaneous) audio clips in HTML5. IE9 and FF3/4 handle it well. Chrome has some issues that seem to be specifically related to simultaneous playback (which probably won't apply to your app). The biggest block I hit was in IE9, which seems to have a mysterious limit on the number of audio objects that can exist at once. The max is about 40, after which IE9 will silently fail to download the file. FF and Chrome both try to support an unlimited number.
The alternative approach of putting all the audio in a single file and changing the offset to play is a bad choice for several reasons. It's much more complicated to code, you need to keep up with additional metadata (where does the clip start, how long is it), and it's likely to work slightly differently between browsers. And the worst part: there's really no way to know when your clip is fully loaded. You can only tell when the clip "can play through", which is determined by the browser based on the size of the audio file and the current download rate. This means that even after the browser reports the audio clip is ready, you may not be able to play a clip somewhere near the end.
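For what it's worth, the per-clip approach the question describes can be as simple as something like this (a sketch; the file names, paths and error handling are illustrative, and on IE9 you'd want to keep the total under the ~40-object limit mentioned above):

const clips = {};
['click', 'ding', 'whoosh'].forEach(function (name) {   // illustrative clip names
  const a = new Audio('sounds/' + name + '.mp3');        // illustrative paths
  a.preload = 'auto';                                     // hint the browser to fetch it up front
  clips[name] = a;
});

function playClip(name) {
  const a = clips[name];
  a.currentTime = 0;                                      // restart if it's already playing
  a.play();
}

// e.g. call playClip('click') from a click handler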
It seems like a reasonable approach. However, you need to consider a couple of things.
Each sound clip will need to be held in memory. While this will not matter in most cases, users with a lot of tabs open, multiple programs running or old computers may see their machine slow down, especially if the sound files are large.
From a usability point of view, if I hear a sound every time I click a button on the site, I'll leave immediately.