Streaming realtime video with alpha layer - H.264

I am working on a project that converts RGBA images to H.264 (NAL units), streams them to a client browser, decodes the frames, and displays them on a canvas using Broadway.js.
My current problem is that when my images contain data in the alpha layer, it is lost in the video stream.
It seems that H.264 cannot carry an alpha layer at all. I have tried H.265, but the encoding times doubled.
Are there any alternative codecs that are fast enough for realtime streaming but do support an alpha layer? Or is there a way to carry an alpha layer in the H.264 stream?
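For context, a minimal TypeScript sketch of the client-side decode step described above, assuming Broadway.js's Player object (constructor options, a decode() method taking Annex-B NAL units, and a canvas property); the WebSocket transport and endpoint are purely illustrative assumptions, since the question does not say how the frames reach the browser:

```typescript
// Sketch of the decode-and-display step, assuming Broadway's Player API.
// The option names and the WebSocket endpoint are assumptions for illustration.
declare class Player {
  constructor(options: { useWorker?: boolean; webgl?: boolean | string });
  canvas: HTMLCanvasElement;
  decode(nal: Uint8Array): void;
}

const player = new Player({ useWorker: true, webgl: true });
document.body.appendChild(player.canvas); // Broadway renders into its own canvas

// Feed Annex-B NAL units as they arrive from the server (assumed here to be
// one NAL unit per WebSocket message).
const ws = new WebSocket("ws://localhost:8080/stream");
ws.binaryType = "arraybuffer";
ws.onmessage = (ev: MessageEvent<ArrayBuffer>) => {
  player.decode(new Uint8Array(ev.data));
};
```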

Related

live555 H.264 trick play

Does anyone know if the live555 RTSP server can correctly reverse-play H.264 video?
I made an index file for the video using live555's native MPEG2TransportStreamIndexer and tried to reverse play at speed -1 or -2, but the video freezes for many seconds, jumps to another frame, and then keeps freezing.
I don't think MPEG-2 TS supports reverse play, and it is only a transport layer; the contained media stream itself has to support reverse playback. MJPEG is the simplest media format that can be played both forward and backward.

Playing video with alpha channel in AS3 (VP6 On2)

I want to display a video with an alpha channel. I've found an old article that describes exactly what I want to do and says that it is possible with the On2 VP6 codec, which appears to be correct according to the Adobe site:
The On2 VP6 codec provides:
Higher quality video when compared to the Sorenson Spark codec encoded at the same data rate
Support for the use of an 8-bit alpha channel to composite video
The designer I am working with was able to create such a video in After Effects, but when I play it, Flash Player does nothing: no errors, no log entries; it just runs silently without drawing a thing. However, when I asked the designer to encode the video without the alpha channel, Flash played it perfectly.
The code I am using is pretty straightforward:
import fl.video.FLVPlayback;
import fl.video.MetadataEvent;

var flvPlayback:FLVPlayback = new FLVPlayback();
flvPlayback.addEventListener(MetadataEvent.METADATA_RECEIVED, onMetadataReceived);
flvPlayback.width = 300;
flvPlayback.height = 300;
addChild(flvPlayback); // the component has to be on the display list to render
flvPlayback.play("http://192.168.0.102:9998/assets/video/test.flv");
I am developing in IntelliJ IDEA, using Flex SDK 4.6 and FLVPlaybackAS3 component from Flash Professional 2015.
Do I understand correctly that Flash dropped support for FLV videos with an alpha channel? If so, is there any production-ready alternative?
P.S. I am aware that such an effect can be produced by compositing two videos onto a bitmap (where one video contains the RGB data and the other contains the mask as RGB), but it doesn't produce a steady FPS on average hardware.
Try this code:
flvPlayback.alpha = 0.2;
I have checked this with FLVPlayback 2.5 in the Flash Player app.

How does Chrome decide how much video to buffer for HTML5 MP4?

I have an MP4 video that is variable bitrate, so the average bitrate doesn't necessarily stay consistent throughout the entire file. Because my video is a capture of a computer screen, some parts of the video are very low bitrate because nothing is happening, and other parts are a much higher bitrate because there's a lot of activity on the screen.
How does Chrome decide how much video to buffer for progressive download HTTP(S) videos? I'm running into a problem where Chrome tends to buffer too little, so playback stutters.
If there's no way of convincing Chrome to download a certain duration of video (and I don't want to just preload the entire thing), can I author the MP4 in some special way to solve the problem? I'm using FFmpeg and MP4Box. Or maybe it's up to the HTTP server?
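To see what Chrome has actually decided to buffer, the standard buffered TimeRanges of the video element can be inspected from script; a small sketch (the element id is hypothetical):

```typescript
// Inspect how much Chrome has buffered. "myVideo" is a hypothetical element id;
// buffered/TimeRanges is the standard HTMLMediaElement API.
const video = document.getElementById("myVideo") as HTMLVideoElement;

video.addEventListener("progress", () => {
  for (let i = 0; i < video.buffered.length; i++) {
    const start = video.buffered.start(i);
    const end = video.buffered.end(i);
    // The range containing currentTime shows how far ahead playback can go
    // before a stall; logging it reveals where Chrome stops requesting data.
    console.log(`buffered range ${i}: ${start.toFixed(1)}s - ${end.toFixed(1)}s`);
  }
});
```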
If you want more control over the playback of the video, you should definitely check out Media Source Extensions (MSE). They define a whole model for video playback, with SourceBuffers into which you append video data, and so on.
Be aware that it is still not a simple API to use, and the information on how to use it is very fragmented.
In your case, if you go the MSE route, you can either keep using H.264 (which is probably the codec your MP4 is wrapping) or switch to WebM.
If you go the MP4/H.264 route, you'll need to generate a fragmented MP4 (fMP4) and write some JavaScript to control the way you work with the MP4 fragments (segments in MSE parlance).
MP4Box supports DASH segmentation (the -dash option), which will split an MP4 in a way that is suitable for consumption via MSE. FFmpeg can also generate fragmented MP4.
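To make the MSE route concrete, here is a minimal sketch; the segment URLs and the avc1 codec string are assumptions for illustration and would have to match the actual fragmented MP4:

```typescript
// Minimal MSE sketch: append fMP4 segments to a SourceBuffer so the page,
// not the browser's progressive-download heuristics, controls buffering.
// Segment names and the codec string are hypothetical.
const video = document.querySelector("video") as HTMLVideoElement;
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener("sourceopen", async () => {
  const sb = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.64001f"');

  // Helper: appendBuffer is asynchronous, so wait for "updateend" each time.
  const append = (data: ArrayBuffer) =>
    new Promise<void>((resolve) => {
      sb.addEventListener("updateend", () => resolve(), { once: true });
      sb.appendBuffer(data);
    });

  // Initialization segment first, then media segments in order.
  const segments = ["init.mp4", "seg1.m4s", "seg2.m4s"]; // hypothetical names
  for (const url of segments) {
    const buf = await (await fetch(url)).arrayBuffer();
    await append(buf);
  }
  mediaSource.endOfStream();
});
```

The point of going through this trouble is that the page decides how far ahead to fetch and append, instead of relying on whatever buffering heuristic Chrome applies to a plain progressive download.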

Accurate Seek in Pseudo-streaming Flash

I would like to know whether the Flash fallback (MediaElement.js or any other) can support frame-accurate seeking for an H.264 video in an MP4 container, served over HTTP by an IIS or Apache web server (HTTP pseudo-streaming or progressive download). In HTML5, we can seek to any frame and the player manages to decode whatever is needed to display the requested frame index.
Currently, what I observe with every Flash player is something like a seek to the closest GOP boundary or keyframe. Unfortunately, this is not enough for my project, as I really need to be able to reach any arbitrary frame.
Do you know if there is a way to get a frame-accurate seek in Flash using HTTP pseudo-streaming? (I know this is possible with FMS and RTMP streaming.)
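For comparison, the HTML5 behaviour mentioned above is just a currentTime assignment; a small sketch, where the frame rate and target frame index are hypothetical values:

```typescript
// HTML5 video seeks to an arbitrary time: the browser decodes from the previous
// keyframe up to the requested position. Frame rate and frame index are assumed.
const video = document.querySelector("video") as HTMLVideoElement;
const fps = 25;           // assumed constant frame rate
const targetFrame = 1234; // hypothetical frame index

video.addEventListener("seeked", () => {
  const frame = Math.round(video.currentTime * fps);
  console.log(`now at ${video.currentTime.toFixed(3)}s (frame ~${frame})`);
}, { once: true });

// Seek to the middle of the target frame's display interval so rounding
// does not land on the previous frame.
video.currentTime = (targetFrame + 0.5) / fps;
```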

How to force the NetStream to create a keyframe?

I created a video stream recording application that works fine, except that the recorded FLVs are a little bit corrupt. :) If I open an FLV in VLC, everything is green but gets "clean" when changes occur, and the video breaks up especially at the beginning. (I use Red5 1.0.)
For pre-recorded streams, the keyframes are already encoded into the file and they cannot be changed. If you're serving a live stream, the keyframes need to be set in the application that is encoding the live stream (for example, Flash Media Live Encoder).