Real-time joining of base64 .webm video chunks and playback without delay or flicker issues - html

I'm trying to build a live streaming system without a P2P connection, using Node.js.
Video chunks are recorded every 3 seconds, sent to the server via WebRTC, and saved simply as base64-encoded .webm files.
But when a client tries to watch the whole video, there is a flickering problem.
Each time a 3-second chunk ends, the src attribute of the video tag is changed to the next chunk.
I tried keeping some upcoming chunks in a buffer to prevent delays, but the delay is still considerable.
I also used the MediaSource API to add a source buffer, but it didn't work with that specific WebM encoding either.
The code is huge and complicated and wouldn't help much, but if it's needed to clarify the concept I can post it too.
Thanks.
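A common reason the MediaSource route fails with MediaRecorder output is that only the first recorded blob contains the WebM initialization segment; chunks recorded by restarting the recorder, or appended out of order, get rejected by the SourceBuffer. A minimal sketch of the approach, assuming the chunks come from one continuous recording and are appended in order; the `vp8,opus` mime string and the element id are assumptions that must match your recorder and page:

```javascript
// Decode a base64 string into a Uint8Array suitable for appendBuffer().
function base64ToBytes(b64) {
  const bin = typeof atob !== 'undefined'
    ? atob(b64)                                     // browser / modern Node
    : Buffer.from(b64, 'base64').toString('binary'); // older Node fallback
  const bytes = new Uint8Array(bin.length);
  for (let i = 0; i < bin.length; i++) bytes[i] = bin.charCodeAt(i);
  return bytes;
}

// Browser-only wiring (guarded so the decode helper also runs under Node).
if (typeof MediaSource !== 'undefined') {
  const mime = 'video/webm; codecs="vp8,opus"'; // assumption: match MediaRecorder's settings
  const video = document.getElementById('player'); // assumption: your <video> element
  const ms = new MediaSource();
  video.src = URL.createObjectURL(ms);
  ms.addEventListener('sourceopen', () => {
    const sb = ms.addSourceBuffer(mime);
    sb.mode = 'sequence'; // play chunks back-to-back regardless of embedded timestamps
    // Call this with each base64 chunk as it arrives from the server.
    function appendChunk(b64) {
      const append = () => sb.appendBuffer(base64ToBytes(b64));
      if (sb.updating) sb.addEventListener('updateend', append, { once: true });
      else append();
    }
  });
}
```

With a single SourceBuffer the video element never swaps its src, so there is no flicker between chunks; the buffer is simply extended as data arrives.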

Related

HTML 5 <video src="path/to/file.mp4" ...> video file is loading very slowly, sometimes never

<video id="mash-video-player"
       src="videos/xyz.mp4"
       width="100%"
       height="100%"
       controls
       preload="metadata">
</video>
When a video is clicked, the HTML above is dynamically built, injected into a "video player", and played back. The contents of the player are replaced every time a different video is clicked.
There are many videos in the folder that get listed for playback. The first one clicked (not the first one in the list) usually gets displayed and played almost instantly. From there on, it's a roll of the dice. Click on the next one and it might take 15 seconds, or never play at all, even if the exact same video file is clicked again.
The original video files are around 100-300MB. I processed them down to 40-80MB. The video file size doesn't matter. The slow behavior is exactly the same.
One strange side-effect of this is that when it comes to refreshing the page to restart the webapp, the reload can take just as long or sometimes never get reloaded until I reboot the web server.
There are several Stack Overflow questions out there describing similar problems. I have reviewed and tried the sensible solutions.
Encoding might be a problem, but these are MP4 video files, and not very big.
% file xyz.mp4
xyz.mp4: ISO Media, MP4 v2 [ISO 14496-14]
Some of the solutions talk about the metadata being at the end of the file. So there is qtfaststart, now defunct and replaced by qtfaststart2. This python app attempts to move the metadata to an optimal location. Apparently all my videos are "optimal". This is not a solution.
Other solutions talk about caching and such. But even though 300MB might take some time to shove through a python HTTP server, it shouldn't take more than a few seconds. Ultimately that might be the core of the problem: using a python HTTP server.
As a work-around, I am running the video in a profoundly old-fashioned window.open() context, but it works as expected: videos load fast and clean using a "file:///videos/xyz.mp4" URL.
Hardware/Software:
Hardware: MacPro 3.7GHz Xeon, 32GB ram
Browser: Brave Version 1.46.134
Server: Python 3.9.2, HTTPServer, SimpleHTTPRequestHandler
Client and server are both running on the same machine.
Any and all suggestions welcome.

playing a growing mp3 file on a web page

need your advice.
I have a web service which generates mp3 files out of wavs.
The conversion takes time, and I want visitors to be able to start listening right away, while the conversion is still going on.
Having tried the HTML5 <audio> tag, I found that I can only play the part of the mp3 file that was ready at the moment the file was fetched. It doesn't seem to notice that the file has grown since.
What is the right way to approach this situation?
Thanks in advance for any info.
I believe that you can use jPlayer to play them. One of its features is that it does not preload.
EDIT: The HTML5 audio player also has the preload attribute, which can take the following values:
"none": will not prefetch anything
"metadata": will fetch basic information such as length and sample rate
"auto": will prefetch the entire mp3
You need to get a bit more control over the serving process, rather than just leaving it up to your web server.
When your web server responds to an HTTP request, it includes a Content-Length header that tells the client how big the requested resource is, in bytes. The server will only send up to the length available at the time of the request, because it doesn't know the file is about to be appended to. The client will download all of that data, but from its perspective it has downloaded the entire file, when really the file wasn't even done being encoded yet.
To work around this issue, you need to pipe the output of your encoder to both a file and your client at the same time. For the response data to the client, do not include a Content-Length header at all. Most clients will work with chunked encoding, allowing you to stay compliant with HTTP/1.1. For the few clients that cannot handle chunked encoding (early Android, old browsers, old VLC), just stream the data as it comes in with no length framing, HTTP/1.0-style.
How you do this specifically depends entirely on what you're using server-side, which you didn't specify in your question. Personally, I use Node.js for this as it makes the process very easy. You can simply pipe to both streams. Be careful that if you use the multiple pipe method, the pipes only run as fast as the slowest. Some streaming clients (such as VLC) will lower the TCP window size to not allow too much data to have to be buffered client-side. When this occurs, your writes to disk will run at the speed of the client. This may not matter to you, but it is something to be aware of.

Possible to buffer video in Windows Phone 8?

Is there any way to buffer video in a Windows Phone 8 app?
I want to create an app that buffers the last 30 seconds or so of video so that the user can tap the screen and get a video file that includes the 30 seconds of video taken prior to their tapping the screen.
I've looked at both the .NET CaptureSource API, and the WP8 only AudioVideoCaptureDevice, both look like they record directly to a file on IsolatedStorage:
For CaptureSource you use a FileSink object to write an mp4 file of your recorded video.
For AudioVideoCaptureDevice you can write to a RandomAccessStream. WP8 doesn't have the InMemoryRandomAccessStream though, so the only way I see to get a RandomAccessStream is to create one from a storage file.
For CaptureSource you could write your own VideoSink class to buffer your video and use that instead of FileSink, but then you would be stuck working with the raw video data, and you'd have to write your own encoder to get it into a format like mp4.
Is there anything I'm missing, or is buffering video just not possible on WP8 unless you write your own encoder?
I'm not sure you can do this, for various reasons. You might be able to cache video in memory by making your own implementation of IRandomAccessStream, but, as you noted, you would then be working with raw video, and depending on the resolution, 30 seconds of raw video and audio can weigh more than the total memory allowed for the application, so the system may close your app.
I don't know whether you could use a MediaElement to play the video without showing it to the user and, when the user taps the screen, rewind to the start position and show it, since the OS automatically caches streamed videos. (This is just an idea; I haven't tested it in any way.)
Sorry for not being more useful :(

Is it necessary to convert database of mp3's to ogg and wave to use html audio?

I have thousands of mp3 files in a database and a website that lets people listen to them. I'm currently using a Flash player but want to move to an HTML5 audio player. Does this mean I need to make Ogg and WAV versions of all my audio files? What is a good approach to making these files more accessible?
In short, yes you need to support multiple formats. (Assuming you care about decent browser support.)
If you are lacking disk space, don't get a lot of traffic, and don't mind some delay before the data gets to the user, you can convert these on the fly. Just write some code so that on request, it checks the conversion cache to see if you have already converted the file. If not, convert it on the fly (with something like FFMPEG) and write the data to disk at the same time you are writing it to the client.
As Imre pointed out, browser support is changing all the time. Look up what codecs are supported from time to time.

Live Audio Streaming to a browser methods, must be very simple

I'm recording a mono audio stream using a PIC at 8-bit 8kHz and streaming it raw to another microprocessor that houses a web server. I'm currently buffering the data and turning it into a wav file that gets played in the browser. What I'd like to do is continuously stream the audio as it's being recorded, without putting a lot of encoding overhead on the second processor. I've been searching, but most results cover streaming from a stored file; since the file size isn't known ahead of time, I'm not sure how to do this without the overhead of mp3 encoding.
You may find that simply creating a WAV file (or other raw format) that keeps growing will, in most players/browser plugins, cause the file to act as a live stream. This is, I believe, basically how Ogg streaming and the like work. Because the player begins playing before the download is done anyway, it keeps playing and downloading until the end of the file; but the file has no end, so it just keeps going.
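One way to realize this for WAV, sketched under the assumption that your players tolerate it (most do): emit a single 44-byte PCM WAV header whose RIFF and data size fields are set to the maximum, then keep writing raw samples, and the player treats the stream as effectively endless:

```javascript
// Build a canonical 44-byte PCM WAV header with "unknown" (maximum) sizes,
// suitable for prefixing a live stream of raw samples.
function wavStreamHeader(sampleRate, bitsPerSample, channels) {
  const h = Buffer.alloc(44);
  const byteRate = sampleRate * channels * (bitsPerSample / 8);
  h.write('RIFF', 0);
  h.writeUInt32LE(0xffffffff, 4);      // RIFF chunk size: unknown, use max
  h.write('WAVE', 8);
  h.write('fmt ', 12);
  h.writeUInt32LE(16, 16);             // fmt chunk size for PCM
  h.writeUInt16LE(1, 20);              // audio format 1 = PCM
  h.writeUInt16LE(channels, 22);
  h.writeUInt32LE(sampleRate, 24);
  h.writeUInt32LE(byteRate, 28);
  h.writeUInt16LE(channels * (bitsPerSample / 8), 32); // block align
  h.writeUInt16LE(bitsPerSample, 34);
  h.write('data', 36);
  h.writeUInt32LE(0xffffffff, 40);     // data chunk size: unknown, use max
  return h;
}
// For the 8-bit 8kHz mono stream above: send wavStreamHeader(8000, 8, 1)
// once, then write the raw PCM bytes from the PIC as they arrive.
```

No per-sample encoding is needed, so the second processor only prepends 44 bytes and forwards the raw data.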
VLC media player can stream FLV and many other formats.