Recording high quality video using Flash and Red5 Media Server - actionscript-3

I'm running a Video Recorder application (written in ActionScript 3.0) on my local machine. It records to a Red5 server installed on a remote Amazon EC2 instance.
To record, I'm using the following settings (a code sketch follows the list):
Width, height and FPS (for Camera.setMode()) - 1920 x 1080 and 10
Bandwidth and quality (for Camera.setQuality()) - 0 and 80
Buffer time (for NetStream.bufferTime) - 3600
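In code, that setup looks roughly like this (a minimal sketch; the connection URL and stream name are placeholders, and error handling is omitted):
var nc:NetConnection = new NetConnection();
nc.connect("rtmp://my-ec2-host/vod"); // placeholder URL
// ...once NetConnection.Connect.Success has fired:
var cam:Camera = Camera.getCamera();
cam.setMode(1920, 1080, 10); // width, height, fps
cam.setQuality(0, 80);       // bandwidth = 0 (no cap), quality = 80
var ns:NetStream = new NetStream(nc);
ns.attachCamera(cam);
ns.bufferTime = 3600;        // let outgoing data queue up locally
ns.publish("myRecording", "record");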
I'm able to record video until the buffer fills (I'm monitoring NetStream.bufferLength constantly).
Once the recording is stopped, the data remaining in the buffer is sent to the server. But when I then try to play the recording back (with bufferTime = 1), the video doesn't appear.
I have ssh'ed into the EC2 server and seen that the file does get created in the red5/webapps/vod/streams folder, but I'm unsure about its quality or whether it was recorded correctly. I've even tried to play the file with the command-line movie player mplayer, but it doesn't play; I'm guessing the Ubuntu EC2 server lacks the playback codecs (not sure of this, though).
However, when it's a low quality recording at 640 x 480 instead of 1920 x 1080, the buffer never fills beyond 0.1 or 0.2, and the video plays back smoothly.
My Internet upload speed is around 300 kbps.
How can I (if it is possible), record and then playback high quality video?

You need to use
// Cap outgoing video data at 43690.6 bytes (~43 KB) per second.
camera.setQuality(43690.6, 0);
This works for me. I used an Amazon EC2 Extra Large instance.

Your issue stems from these three causes happening simultaneously:
recording high quality video which results in the data having to be buffered locally
Flash Player buffering just the video data, while the audio goes out in real time (a behavior suited to live streaming)
Red5's buggy mechanism for dealing with video data coming at the end of the recording
Red5 has been plagued by many recording issues. This HDFVR documentation article covers Red5's various recording issues and the mechanism for coping with the FP buffer when recording over slow connections.
The media server needs to account for this by waiting longer for the video packets and sorting them together with the audio packets before writing the data to disk (the .flv file).
Red5 0.8 had no such mechanism, so recording high quality video over slow connections resulted in low quality/scrambled video files (just audio, with all the video at the end).
In Red5 0.9, audio/video recording was totally broken.
Red5 1.0 RC1 introduced a new delayed write mechanism - controlled by the queueThreshold setting in Red5/conf/red5-common.xml - that waits for the audio and video data and rearranges the packets before writing them to disk. The queueThreshold value is measured in RTMP messages (packets).
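For illustration, the setting is a property on the FLV writer bean in that file. A sketch of its shape (the bean id and value here are illustrative and may differ between Red5 versions):
<bean id="fileConsumer" class="org.red5.server.stream.consumer.FileConsumer">
    <!-- number of RTMP messages to queue and reorder before writing to the .flv -->
    <property name="queueThreshold" value="120"/>
</bean>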
Red5 1.0 final, 1.0.1 and 1.0.2 had the delayed write mechanism totally broken: with it turned on, over slow connections, Red5 produced .flv files with only 1 or 2 video keyframes. When playing back such .flv files, the video would get stuck in the first second and playback would continue with audio only. Using yamdi to extract the keyframe data confirmed that the .flv files were lacking video keyframes.
However, thanks to HDFVR's code contributions to Red5, video recording over slow connections is fixed in Red5 1.0.3 and later.

Related

WebRTC native sends packets very slowly

I am working on a streaming server based on WebRTC native. For this project, I've hacked the WebRTC native source code (version M60, Windows 10 x64) to be able to feed it a pre-encoded H264 bitstream (1080p25, all frames encoded as I-frames). By default, WebRTC uses 42e01f as the H264 profile; I changed it to 640032 (level 5) to support 1080p. In h264_encoder_impl.cc, I commented out the encoding part, just copying the bytes from the input frame into the buffer of encoded_image_ and generating the fragmentation information.
It works, but packets are sent to the client (Chrome) very slowly (about 2-3 fps). If I limit the feeding speed to 12 fps, it works well.
I spent a lot of time debugging the code. What I found is that packets are sent slowly in paced_sender.cc, so the packet queue soon fills up; the encoder is then blocked and stops putting new packets into the queue until the queue is no longer full. I tried removing the bitrate limitation in paced_sender.cc, but the sending speed was still slow.
I also checked the graphs on the Chrome WebRTC debugging page (chrome://webrtc-internals) to see whether the problem could be on the receiver side: decoding costs only about 2 ms per frame, the rate of receiving frames is about 2-3 fps, and no packets are lost.
P.S. The LAN is 1 Gbps.
After debugging for days, I still have no idea why packets are sent so slowly. The H264 bitstream is encoded as all I-frames; could that be the problem?
Any reply will be appreciated, thanks in advance!
Answering my own question: build WebRTC in release mode.
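For anyone else hitting this: the debug build is what slows the pacing down. A WebRTC release build is generated with GN's is_debug=false argument, for example (the output directory name is arbitrary):
gn gen out/Release --args="is_debug=false"
ninja -C out/Release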

How does Chrome handle videos stored in memory?

I'm building an application with Electron. My application has hundreds of videos that are activated using the basic HTML5 video play and pause functions. The app sometimes uses as much as 8 GB of memory. This is fine for me because I have 16 GB of RAM, but I am unsure what would happen on a computer with less RAM.
Would my application crash on a system with less RAM, or does Chrome automatically evict videos from memory to make space? If so, how does it choose which videos to evict? Is this what is known as "garbage collection"?

HLS - how to reduce delay?

Does anyone know how to configure an HLS media server to reduce the delay of live streaming video a little?
What kinds of parameters do I need to change?
I have heard that you can do some tuning using parameters like HLSMediaFileDuration.
Thanks in advance
An HTTP Live Streaming system typically has an encoder, which produces segments of a certain number of seconds, and a media server (web server), which serves playlists containing lists of URLs to these segments to player applications.
Media Files = Segments = .ts files = MPEG2-TS files (in HLS speak)
There are some ways to reduce the delay in HLS:
Reduce the encoded segment length from Apple's recommended 10 seconds to 5 seconds or less (see the ffmpeg sketch after this list). Reducing segment length increases network overhead and load on the web server.
Use lower bitrates; larger .ts files take longer to upload and download. If you use multi-bitrate streams, make sure the first bitrate listed in the playlist is a little lower than the bitrate most of your users use; this reduces the time to start playing back the stream.
Get the segments from the encoder to the web server faster: upload while still encoding if possible, and update the playlist as soon as the segment has finished uploading.
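As a sketch of the first point, a segmenter such as ffmpeg can be told to cut short segments through its HLS muxer (the input file, codecs and paths below are placeholders):
ffmpeg -i input.mp4 -c:v libx264 -c:a aac \
  -f hls -hls_time 2 -hls_list_size 5 playlist.m3u8
Here -hls_time sets the target segment duration in seconds and -hls_list_size caps how many segments the live playlist keeps.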
Also remember that the higher the delay, the better the quality of your stream (low delay = lower quality): with larger segments there is less overhead, so more space for video data; taking longer to encode yields better quality; and more buffering means less chance of the stream stuttering on playback.
HLS is all about trading a longer delay for better playback quality, so you will never be able to use HLS for things like video conferencing. Typical delay in HLS is 30-60 seconds; the practical minimum is around 15 seconds. If you want low delay, use RTP for streaming, but good luck getting good quality on low or variable speed networks.
Please specify which media server you use. Generally speaking, yes, changing the chunk size will definitely affect the delay: the smaller the first chunk, the quicker the video will be shown in the player.
Apple actually recommends dividing your file into small chunks of equal, integer-second duration.
In practice, there are huge differences between players; some of them parse the manifest and adjust these values.
A known practice is to pre-cache the first chunks in memory at low and medium resolution (or to try downloading them in the background of the app/page; Amazon does this, though their video is MSS).
I was having the same problem and the keys for me were:
Lower the segment length. I set it to 2 seconds because I'm streaming on a local network. On other types of networks, be careful with the overhead that a low segment length adds, as it can impact your playback quality.
In your manifest, make sure the #EXT-X-TARGETDURATION is accurate. From the HLS specification:
The EXT-X-TARGETDURATION tag specifies the maximum Media Segment duration. The EXTINF duration of each Media Segment in the Playlist file, when rounded to the nearest integer, MUST be less than or equal to the target duration; longer segments can trigger playback stalls or other errors. It applies to the entire Playlist file.
For some reason, the #EXT-X-TARGETDURATION in my manifest was set to 5 and I was seeing a 16-20 second delay. After changing that value to 2, which is the correct one given my segments' length, I am now seeing delays of 6-10 seconds.
In summary, you should expect a delay of at least 3x your #EXT-X-TARGETDURATION, so lowering the segment length and the #EXT-X-TARGETDURATION value should help reduce the delay; a sample playlist follows.
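For illustration, a live playlist consistent with 2-second segments would look roughly like this (segment names and sequence number are placeholders):
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:2
#EXT-X-MEDIA-SEQUENCE:120
#EXTINF:2.000,
segment120.ts
#EXTINF:2.000,
segment121.ts
#EXTINF:2.000,
segment122.ts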

How to play FLV video at "incoming frames speed" (not video-internal) coming from NetStream in ActionScript

How can I play NetStream frames immediately as they arrive, without any additional AS frame-rate logic?
I have recorded some audio and video data packets from the RTMP protocol received by Red5, and now I'm trying to send them back to the Flash client in a loop by pushing packets to NetStream with incrementing timestamps. The looped sequence is about ~20 seconds long and is built from about ~200 RTMP packets (VideoData/AudioData).
Environment: both Flash client and server on localhost, no network bottleneck; the video was H.264-encoded earlier by the same Flash client.
It generally works, but the video is not very fluent: there are lots of freezes, slowdowns and long pauses. The slower I transmit packets, the more pauses and freezes there are, even extremely long pauses like transmitting the whole sequence 2-3 times (~60 seconds) with no effect; this happens when forwarding slower than ~2 RTMP packets per second.
It looks like some AS logic is trying to force the frame rate of the video rather than just outputting received frames. So one of my questions is: does AS look at the in-video frame FPS info in live streaming? Why can it play faster, but not slower? How can I play video "by frames" instead of synchronizing the video FPS with RTMP packet timestamps?
On the other hand, if I push packets faster than recorded, the video just plays faster but is almost fluent; I just can't get a slower or stable stream (the speed is still very irregular).
I have analysed some NetStream values:
.bufferLength = ~0 or 0.001, increasing when I forward packets extremely fast (like targeting ~90 fps)
.currentFPS = shows the real FPS count as seen in the Video object, not incoming frames/s
.info.currentBytesPerSecond = ~8 kB/s to ~50 kB/s depending on forwarding speed
.info.droppedFrames = increases frequently, even if I stream packets at ~2/sec! It also jumps after a long self-initiated pause (even though the buffer is 0 the whole time!)
.info.isLive = true
.info.dataBufferLength = same as .bufferLength
It looks like AS is dropping frames because RTMP packets arrive too rarely, as if it expects them to arrive at the speed of the FPS encoded in the video frames.
My current best NetStream configuration:
chatStream.videoReliable = false;
chatStream.audioReliable = false;
chatStream.backBufferTime = 0;
chatStream.bufferTime = 0;
Note that if I set bufferTime to 1, the video pauses until "1 second of video" is gathered, but that's not what actually happens: buffering is very slow, as if the video were assumed to have an FPS of 100 or 200. Even if I forward packets fast (targeting ~15 fps without a buffer), the buffer takes about 10-20 seconds to fill.
The loop, of course, starts with keyframed video data, and the keyframe interval of the sequence is about 15 frames.
Have you tried netStream.step(1)? (See the sketch below.)
http://help.adobe.com/en_US/FlashPlatform/reference/actionscript/3/flash/net/NetStream.html#step()
Also, I would remove 'live' from the play() call.
Finally, maybe Flash tries to sync to the audio like it does in a regular timeline. Have you tried a video without an audio channel?
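If you try the step() route, a rough sketch is to pause the stream and advance it one frame per forwarded packet. Note the caveats: step() only works when the server/player combination supports in-buffer seeking, and onPacketForwarded below is a hypothetical hook in your packet-forwarding logic:
chatStream.inBufferSeek = true; // step() requires smart-seeking support
chatStream.pause();
function onPacketForwarded():void {
    chatStream.step(1); // render exactly one more frame
}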

Why am I missing frames while recording with Flash?

I'm recording video from a webcam with Flash and saving it on an Adobe (Flash) Media Server.
What are all the things that can contribute to a choppy video full of missing frames, and what do I need to adjust to fix the problem?
The server is an Amazon Web Services Medium (M1) EC2 instance: a 2 GHz processor with 3.75 GB of RAM. Looking at the admin console for AMS, the server never gets maxed out in terms of RAM or CPU percentage.
Bandwidth is never over 4 Mb.
The Flash recorder captures at 320x240 at 15 fps.
I used setQuality(0, 100) on the camera. I can still make out individual "pixels" when viewing my recording, but it isn't bad.
The server has nothing to do with this. If the computer running the Flash file can't keep up, you get dropped frames. You basically have 1000/stage.frameRate ms to run every calculation for each frame; if your application runs at 30 fps, that is roughly 33 ms per frame. You need to make sure everything that has to happen on each frame can run in that amount of time, which is obviously difficult/impossible to guarantee across the wide range of hardware out there.
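A quick way to verify this on the recording machine is to time the gap between frames; if it regularly exceeds the frame budget (about 66 ms at 15 fps), the client itself is dropping frames. A minimal sketch:
import flash.events.Event;
import flash.utils.getTimer;

var lastFrame:int = getTimer();
stage.addEventListener(Event.ENTER_FRAME, function(e:Event):void {
    var now:int = getTimer();
    if (now - lastFrame > (1000 / stage.frameRate) * 2) {
        trace("Slow frame: " + (now - lastFrame) + " ms"); // budget blown
    }
    lastFrame = now;
});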
Additionally, 15 fps itself is too low. The low-end threshold for what the human eye perceives as motion is around 24 fps, so at 15 fps you will notice choppiness. Ideally, you want to record at 30 fps, which is about where the human eye stops being able to distinguish individual frames.