H.264 over RTP - Blurred Section of Video

I've got a packetizer that takes NAL units from an H.264 stream and sends them out over RTP using UDP.
Everything works, but the top ~15% of the resulting video, when viewed in VLC or FFplay, is blurred (screenshot linked below).
The stream uses FU-A fragmentation, which as far as I can tell matches the spec in RFC 6184.
If I save the NAL units to a file before packetization, the video plays perfectly without the blur.
I've spent quite a while trying to work out what's going wrong and have run out of ideas, so I was hoping someone here might have some pointers to help me out.
Any help very much appreciated.
Blur at top of screen
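For comparison, here is a minimal sketch (an assumption about how a packetizer like this is usually structured, not your code) of how RFC 6184 §5.8 builds the FU indicator and FU header for FU-A. A classic cause of corruption in one horizontal band is getting the S/E bits wrong or repeating the original NAL header byte inside the first fragment's payload:

```typescript
// Sketch: split one NAL unit into FU-A payloads per RFC 6184 §5.8.
// Hypothetical helper; maxPayload would normally be MTU minus RTP header size.
function fragmentNal(nal: Uint8Array, maxPayload: number): Uint8Array[] {
  const fuIndicator = (nal[0] & 0xe0) | 28; // F + NRI copied from the NAL, type = 28 (FU-A)
  const nalType = nal[0] & 0x1f;            // original NAL unit type, carried in the FU header
  const body = nal.subarray(1);             // the NAL header byte is NOT repeated in the payload
  const chunk = maxPayload - 2;             // room left after FU indicator + FU header
  const packets: Uint8Array[] = [];
  for (let off = 0; off < body.length; off += chunk) {
    const end = Math.min(off + chunk, body.length);
    const fuHeader =
      (off === 0 ? 0x80 : 0) |              // S bit on the first fragment only
      (end === body.length ? 0x40 : 0) |    // E bit on the last fragment only
      nalType;
    const pkt = new Uint8Array(2 + (end - off));
    pkt[0] = fuIndicator;
    pkt[1] = fuHeader;
    pkt.set(body.subarray(off, end), 2);
    packets.push(pkt);
  }
  return packets;
}
```

If your fragments check out against this shape, the next thing worth verifying is RTP sequence numbers and the marker bit on the last packet of each access unit.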

Related

Wrong duration HTML5 audio on mobile browsers, works fine on regular browsers (m4a file)

I am using the Savedeo API to get YouTube audio files and trying to play the audio in a browser.
When I play an audio-only file (.m4a) in a regular desktop browser, everything works fine. But when I test it in a mobile browser on my iPhone (Safari and Chrome), the audio file's duration is doubled: after the end of the audio there is padding added.
Why is this happening? Is there any workaround for this? If you need more info, I will be happy to provide it.
Thanks
I ran into this issue with an MP3 with a sample rate of 44100 Hz and a 128 kbit/s bitrate. The solution is to change the sample rate with ffmpeg:
ffmpeg -i your.mp3 -ar 22050 your_fixed.mp3
You can retrieve critical information about an MP3 by using ffprobe:
ffprobe your.mp3
iTunes also reports the wrong duration when calculating it from the actual MP3 file. That's embarrassing, since you would expect iTunes to get something like that right. It suggests a bug in some library Apple uses for MP3 duration calculation.

How can I zoom into video and switch streaming of videos in the same HTML5 player?

I have a video that will be divided into 4 videos.
First the player will stream a lower-resolution version of the original video. When the user zooms into the video to see more detail, I need the player to stream whichever of the 4 higher-resolution videos corresponds to where the user zoomed in.
How can I do that using VideoJS or any other video player?
After searching, this is the answer:
For zooming into the video, you can follow this tutorial: Zooming and rotating for video in HTML5 and CSS3
For switching streams in the same player, you can change the source on the HTML5 video tag: do some calculations to work out where the user zoomed in, and change the source video accordingly.
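The "some calculations" part can be sketched as a quadrant lookup. This is only an illustration; the file names and the double-click trigger are assumptions, and any event that gives you an (x, y) inside the player works the same way:

```typescript
// Map a click position inside the player to one of 4 quadrant videos.
// File naming quadrant-0.mp4 … quadrant-3.mp4 is hypothetical.
function quadrantSource(x: number, y: number, width: number, height: number): string {
  const col = x < width / 2 ? 0 : 1; // left or right half
  const row = y < height / 2 ? 0 : 1; // top or bottom half
  return `quadrant-${row * 2 + col}.mp4`;
}

// Usage in the browser (sketch):
// video.addEventListener("dblclick", (e) => {
//   const rect = video.getBoundingClientRect();
//   video.src = quadrantSource(e.clientX - rect.left, e.clientY - rect.top,
//                              rect.width, rect.height);
//   video.load();
//   video.play();
// });
```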
As there is no response yet, let me analyse the problem. This is by no means a full answer, but other people will probably be able to answer parts of it:
First the player will stream a lower resolution of the original video,
This means you will need to create or use a video stream. There are plenty of plugins for video streaming; which one depends on what you want. You could even write it yourself, for example using C#'s System.IO classes, transforming the video into bytes and putting it back together. The resolution part is easiest handled by simply having a separate, lower-resolution video file for this step of the process, used for streaming only.
then the user can zoom into the video to see more details, I need the player to stream one of the 4 videos - that's higher in resolution- based on where the user zoomed in.
So you need to trigger a zoom effect, which means you need to detect zoom. This is possible with JavaScript in a web browser, if you want a browser-based application. When the zoom is triggered, you can retrieve the mouse position on the screen, in the div, or on some sort of overlay. Depending on that position you can show another stream.
How can I make that using VideoJS or any other video player ?
Basically, the steps above are how I would start looking into this specific case. Since you suggest VideoJS, I assume this is browser-based, which probably means using JavaScript libraries, maybe combined with a server-side language.
That's as far as I can go. Maybe someone can pick up specific parts of what I wrote and help you a step further.
Have a nice day!

Determining current rendition for HTML5 HLS streams

I've got an HTML5 <video> element whose source is a .m3u8 (HLS stream).
I have an M3U8 with three different renditions: 640x360, 960x540, and 1280x720
On Desktops I have a Flash Player for playing the video, so the HTML5 fallback is only intended for mobile (iOS and Android) - I am doing all of my testing on an iPad and, once it's working, I will try it out on Android and hope everything works the same.
My goal is to, at any point in time, figure out what rendition the video element is playing. The rendition is subject to change as the user's bandwidth changes.
I tried using the .videoHeight property, but it always returns 480 regardless of the rendition being downloaded - which is particularly odd because 480 isn't even an option.
Does anyone know how I can figure out the rendition being downloaded?
Cleaning up some old questions that never received answers:
Unfortunately, this one is just not possible. The HTML5 video spec and the HTML5 video implementations in most browsers are intended to abstract away all of the underlying magic involved in playing video. You give it a source, it plays. Everything else is completely hidden and you have no access: no access to metadata channels, no access to audio channels, no access to bitrate or resolution information, and so on.
At best I developed a solution to guess which resolution was playing. Every 10 seconds a 1 MB file was loaded over AJAX. I measured the speed at which this downloaded to guess at their current bandwidth. I know that QuickTime will only play a rendition if you have double the required bandwidth. So if the 960x540 rendition requires 1400 kbit/s then it won't play unless you have 2800 kbit/s bandwidth.
It's not very good (and wastes 6 MB of bandwidth per minute) but it's better than nothing.
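The guessing heuristic above can be sketched as follows. The rendition bitrates here are assumptions matching the question's three variants (only the 1400 kbit/s figure comes from the answer), and the "double the required bandwidth" rule is the QuickTime behaviour described above:

```typescript
// Guess the active HLS rendition from a measured probe download,
// assuming the player only picks a rendition when measured bandwidth
// is at least double its required bitrate.
interface Rendition { name: string; kbps: number; }

// Hypothetical bitrates for the three renditions in the question.
const renditions: Rendition[] = [
  { name: "640x360", kbps: 700 },
  { name: "960x540", kbps: 1400 },
  { name: "1280x720", kbps: 2500 },
];

function guessRendition(bytes: number, seconds: number, list: Rendition[]): Rendition {
  const kbps = (bytes * 8) / 1000 / seconds; // measured bandwidth in kbit/s
  // Highest rendition whose required bitrate * 2 fits within measured
  // bandwidth; fall back to the lowest rendition otherwise.
  const playable = list.filter((r) => r.kbps * 2 <= kbps);
  return playable.length ? playable[playable.length - 1] : list[0];
}
```

For example, a 1 MB probe that takes 2.5 s implies 3200 kbit/s, which clears the 2800 kbit/s threshold for 960x540 but not the 5000 kbit/s threshold for 1280x720.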

Play HTML5 Audio immediately without waiting for the entire buffering to complete?

I have a very fast connection, yet it takes about 2-3 seconds before the song actually starts playing. It's an average 128 kbps MP3 (3-4 MB). I have set preload="auto" but that didn't help much. Is there a way to just start playing the audio right away and continue buffering it (sort of like YouTube does)?
Here is an example that I am currently working on. It's going to play an audio simultaneously on all connected clients. So if you have 2+ laptops, you can try it out. All computers must be connected before you start playing the audio. (double click on a song to start playing).
Running video or audio without complete buffering is called smooth/adaptive streaming. It can be achieved in players like Silverlight and Flash.
What it actually does is split the file into chunks and let the user play it chunk by chunk. Since you are downloading chunks, the whole file does not have to be downloaded first.
I'm not giving you a full-fledged answer, since I haven't studied this much, but that's the basic idea of how it works.
I had the same issue, but with HTML5 video. I overcame it using Smooth Streaming on Azure Media Services.
Here is a tutorial of the same : http://www.wrapcode.com/featured/windows-azure-media-services-mp4-to-smooth-streaming/
I will keep you updated once I find something useful :-)
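The chunking idea described above can be illustrated with a toy sketch (this is not how Smooth Streaming actually packages media, and the chunk size is arbitrary; real implementations cut on media boundaries):

```typescript
// Toy sketch: split a file into fixed-size pieces so playback can start
// as soon as the first chunk has arrived, instead of waiting for the
// whole file.
function splitIntoChunks(data: Uint8Array, chunkSize: number): Uint8Array[] {
  const chunks: Uint8Array[] = [];
  for (let off = 0; off < data.length; off += chunkSize) {
    chunks.push(data.subarray(off, Math.min(off + chunkSize, data.length)));
  }
  return chunks;
}
```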
If you use preload="none", you have no buffer at the beginning, but it will buffer your content on the fly.
I have an Icecast server which streams my content, and when I pause and play, it buffers my content even with preload="none".
Do not use preload="auto"; it will take some time to start.

HTML5 video size vs poster size is wrong in Chrome

I used Miro Video Converter (http://www.mirovideoconverter.com/) to convert a .mov to different html5 video formats (webm, ogv and mp4). I manually made a poster image from the 1st frame of the video.
Everything works fine in Firefox and Safari, but when using Chrome, the video seems to be a little bigger than it should be. It's easy to notice when looking at the difference between the poster image and the beginning of the video.
I am wondering if someone has ever encountered the same problem and if the problem here is either my video file or the embedding.
Here is a jsfiddle with the actual files : http://jsfiddle.net/aLvpP/
Ok so, if anyone ever comes here, I finally found the answer to this.
The reason Chrome was showing a bigger/blurry video is that the mp4 version was not in a standard 16:9 format, but close enough to it that Chrome was stretching it slightly.
Part of the blame is on Miro Video Converter, because when I tried using FFmpeg to convert my original mov to mp4, FFmpeg refused, explicitly telling me that "the width of the video can not be divided by 16".
All in all, changing the format of the original video to a standard 16:9 format solved the problem.