In my encoder, I send an RTMP stream to Ant Media Server, but when I try to play the stream in Chrome via WebRTC, Chrome crashes.
Do you know why this happens?
Chrome can play video in the I420 (yuv420p) pixel format, but some encoders output yuv422p instead, and that can make Google Chrome crash.
So you have two options to solve this issue:
Change your encoder settings to output I420/yuv420p (see the example command after this list).
Enable Adaptive Streaming in the AMS Management Panel, so the server transcodes the stream itself.
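For instance, if you push the stream with FFmpeg, you can force the pixel format on the encoder side. This is a minimal sketch; the input file, server address, and stream key are placeholders:
$ ffmpeg -re -i input.mp4 -pix_fmt yuv420p -c:v libx264 -c:a aac -f flv rtmp://your-ams-server/LiveApp/stream1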
Related
I don't know if this is the right place to ask such a question, but I want to download an online video of my graduation. I tried looking through the page source and the Inspect option in Google Chrome, but I didn't succeed. Is there any way to download the video?
This page uses HTTP Live Streaming with the M3U8 playlist format. After obtaining the link to the M3U8 file, it can be downloaded and converted with other software that also supports HLS (e.g. VLC media player, or ffmpeg: ffmpeg -i "https://….m3u8?…" output.mp4).
You can extract the .m3u8 URL using Chrome Dev Tools, which allow you to browse loaded resources in your browser:
Log network activity
Reload the page. The Network panel logs all network activity in the Network Log.
– Documentation › Chrome DevTools › Network
Once DevTools is open (F12), you can press F5 to reload all resources and use the search option to filter for m3u8.
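For example, once you have the playlist URL, a download without re-encoding could look like this (the URL here is a placeholder; -c copy keeps the original streams as long as the MP4 container accepts them):
$ ffmpeg -i "https://example.com/path/playlist.m3u8" -c copy output.mp4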
I can run Google Chrome with a fake webcam using this command:
$ google-chrome-stable --use-fake-device-for-media-stream --use-file-for-fake-video-capture=video.mjpeg
This works fine, but with this configuration I can only use fake audio sources. How can I run Chromium/Google Chrome with a fake video stream and a real audio stream?
You can also use any Python modules you want.
With --use-fake-device-for-media-stream it is not possible; Chrome will always use fake audio:
if (base::CommandLine::ForCurrentProcess()->HasSwitch(
        switches::kUseFakeDeviceForMediaStream)) {
  params_.set_format(media::AudioParameters::AUDIO_FAKE);
}
Use a virtual webcam and your real microphone instead (see the example setup after the list below).
A quick selection from searching:
Webcamoid, Windows/macOS/Linux
OBS Virtualcam, plugin for OBS Studio, Windows/macOS/Linux
Syphon Virtual Webcam, plugin for Isadora, macOS
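On Linux, another way to get a virtual webcam is the v4l2loopback kernel module (not in the list above, but it serves the same purpose). In this sketch the device number, card label, and file name are all placeholders:
$ sudo modprobe v4l2loopback video_nr=10 card_label="FakeCam"
$ ffmpeg -re -stream_loop -1 -i video.mjpeg -f v4l2 -pix_fmt yuv420p /dev/video10
Chrome started without any fake-device flags will then offer "FakeCam" as a camera while still capturing from your real microphone.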
I am using a web page that has a microphone built in, similar to the "Search by voice" feature of Google Search.
In Developer Tools -> Network I can see that a .wav file is created and sent to the engine for processing.
Is there any way to manually send .wav files to Chrome for processing (simulating audio recording)?
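One possibility, building on the fake-device switches shown elsewhere on this page, is to let Chrome play a WAV file as its fake microphone input. This is an untested sketch and the file name is a placeholder:
$ google-chrome --use-fake-device-for-media-stream --use-file-for-fake-audio-capture=recording.wav
The page should then receive the contents of recording.wav whenever it captures audio from the microphone.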
I'm using Node.js as a stream server to stream realtime WebM video that is sent by FFMPEG (executed from another application; the stream is done via HTTP) and received by a webapp that uses the <video> tag.
This is what I'm doing: FFMPEG streams the received frames using the following command:
ffmpeg -r 30 -f rawvideo -pix_fmt bgra -s 640x480 -i \\.\pipe\STREAM_PIPE -r 60 -f segment -s 240x160 -codec:v libvpx -f webm http://my.domain.com/video_stream.webm
(the stream comes from an application that uses the Kinect as source and communicates with FFMPEG through a pipe, sending one frame after another)
When the webapp connects, it receives immediately this response from the server:
HTTP/1.1 200 OK
X-Powered-By: Express
content-type: video/webm
cache-control: private
connection: close
Date: Fri, 06 Dec 2013 14:36:31 GMT
and a WebM header (previously stored on the server, with the same resolution and frame rate as the source stream, and tested as working in VLC) is immediately appended. Then the webapp starts to receive the data streamed by FFMPEG. Here is a screenshot of the MKVinfo GUI showing the fields of the header:
However, even though the Network tab of the Chrome console shows that there is an actual stream of data (meaning that what is streamed is not complete garbage, otherwise the connection would be dropped), the player doesn't display anything. We tried manually prepending our header to the video dump received by the webapp, and VLC plays it just fine, but this is not happening with the <video> tag.
What can cause this problem? Are we missing something about the encoding on the FFMPEG side, or did we store wrong values in the header (or are they not enough)?
PS: I cannot rely on an external streaming server.
PPS: We tried the following experiments:
substituting the video header with the one stored on the server makes the video playable in both VLC and the <video> tag
if we dump a video that has already started (without a header) and prepend the video header stored on the server, or even its original header, the video is playable in VLC but not in the <video> tag (we're carefully prepending the header just before the beginning of the cluster).
There are so many variables added to this problem when you consider that you're using a technology outside of (and not integrated into) Node to stream your video. This could cause issues with the load balancer or proxy you are using, or it could be that you're hosting two applications on the same port.
Could you do the streaming in just Node? Or could you even just stream from FFMPEG to the filesystem and serve that with fs.createReadStream()? That would reuse the same webserver instead of spawning an entirely new server on the same box. And if you're just streaming that content from point to point, then you need to buffer the data coming through and forward the buffer as a stream through Node.
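If you try that filesystem route, the FFmpeg side might look like this (a sketch based on the command in the question: the pipe name and video parameters are the asker's, the output file name is assumed):
$ ffmpeg -f rawvideo -pix_fmt bgra -s 640x480 -r 30 -i \\.\pipe\STREAM_PIPE -codec:v libvpx -f webm stream.webm
Node can then serve stream.webm by piping fs.createReadStream() into the HTTP response.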
The reason why technologies get integrated, wrapped, and extended into other frameworks is uniformity. Your question, though well detailed, still leaves a lot open. It raises questions about how FFMPEG converts and serves HTTP content, and how your load balancer/proxy handles that. Does Node have anything to do with this? Is there a replacement for FFMPEG so you can standardize around Node's framework? Is Node right for this application?
A little bit of background: I am trying to create a Google Chrome app that will play videos in your browser locally, meaning it should be able to play without an Internet connection. Since I'm only using HTML/JavaScript, I can only play WebM, MP4, and Ogg files. However, I am interested in playing other formats, such as AVI, MPG, and maybe a few others. I was thinking of somehow creating a local server where I can run ffmpeg to encode the videos to WebM/MP4. So my main objective is to get ffmpeg to run. Is this possible?
Not with just HTML5 and JavaScript. If you have the option of running a local webserver, then you can run any executable you want, including ffmpeg. You'll basically be calling a server-side process (using PHP, Java, or whatever) which happens to be running locally.
Download FFMPEG.EXE from here
Extract the FFMPEG.EXE from the archive.
Place the FFMPEG.EXE in the webdirectory.
Assuming you already know how to upload a file, I'm moving to the next step.
After the file is uploaded, just add the line below:
exec("ffmpeg -i recipe_videos/$path -f flv recipe_videos/$test[0].flv");
// Format: exec("ffmpeg -i <path of the uploaded video> -f <target format> <where the file is to be saved, with extension>");
If you want, delete the old file.
Now you can view the video in the desired format.
What you're trying to do is impossible. The browser prevents the server from accessing the user's computer (including running programs). You'll either need to transcode the videos on the server before serving them or have the user download FFmpeg and transcode the videos themselves.
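If you take the server-side route, a typical browser-friendly transcode could look like this (file names are placeholders; -movflags +faststart moves the index to the front of the MP4 so playback can start before the download finishes):
$ ffmpeg -i input.avi -c:v libx264 -c:a aac -movflags +faststart output.mp4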