How to embed streaming RTSP media into an HTML5 page

I have a security cam that sends via RTSP, which I'm able to capture in VLC player, but I want to embed that stream into my webpage. I've been searching for hours, but have failed to find any recent documentation on how to do this.
I am not set on VLC either, so I'm basically trying to go from cam -> RTSP -> player (if required) -> HTML embed.
Any help would be appreciated. And I know this is an open question, but I'm failing to find what I need on the net, so I'm open to any solutions.
With that said, I'm not looking for 3rd party providers to send the stream to me. For security reasons, the stream will not exit the compound.
Please do not send me old links to old articles either. I have scoured the web and have probably already read them, and my experience is that things have changed. I'm looking for answers from people who have experienced similar issues and been able to resolve them. Thanks!

I. Open VLC and select "Open Network Stream" via the Media menu.
II. Input your IP camera's RTSP string (credentials included), e.g.
rtsp://test:test@192.168.0.37:554/cam/realmonitor?channel=1&subtype=1
which is the string for my IP camera.
III. Click the down arrow next to the Play button and select "Stream".
IV. For the destination, set it to "HTTP", then select "Add". The port field is where you set which port VLC uses to stream the video; in this example I used 8080. The path you can leave as "/".
V. Check the box for "Activate Transcoding" and set the profile to "Video - Theora + Vorbis (OGG)".
VI. Click the Screwdriver + Wrench icon, set the encapsulation to Ogg/Ogm and the video codec to "Theora", then set the bitrate you want to broadcast the stream to your site at (for what it's worth, I simply use the same bitrate the camera streams at). You can also set your framerate here.
VII. Using the "Resolution" sub-tab you can use "Auto" for scale, width, and height. You can disable the audio codec if your camera does not have a mic or you do not want to broadcast the audio, and disable subtitles. Finally click "Save", then "Next".
VIII. Check the box for "Stream all elementary streams" and then click "Stream". Keep in mind VLC will show a black box where the video would normally be; this is expected. You should see the video timer moving just above the Pause/Play button.
IX. Then drop this code into your page:
<video id="video" src="http://IP_of_VLC_computer:VLC_Port" autoplay="autoplay" width="videowidth" height="videoheight"></video>
One of mine is as follows:
<video id="video" src="http://192.168.0.4:8080" autoplay="autoplay" width="704" height="480"></video>
X. Load your web page to see what the video looks like. Do not be concerned if you see what looks like a green screen; just refresh the page every 5 seconds or so to force the page to update the stream. That is common with RTSP video transport.
To sum it up, you are turning your PC into a transcoder by way of VLC, restreaming the RTSP video over HTTP in an HTML5-friendly format.
I uploaded a 1 min 46 sec video to YouTube showing how to complete this process.
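If you would rather not refresh the whole page by hand (step X), a small script can reload just the <video> element from step IX when the stream errors or stalls. This is only a rough sketch, using the same IP/port placeholder as above:

<script>
  // Reload only the <video> element when the HTTP/Ogg stream errors or stalls
  // (the 5-second delay mirrors the manual refresh interval from step X).
  var video = document.getElementById('video');
  function reloadStream() {
    setTimeout(function () {
      video.src = 'http://IP_of_VLC_computer:VLC_Port';
      video.load();
      video.play();
    }, 5000);
  }
  video.addEventListener('error', reloadStream);
  video.addEventListener('stalled', reloadStream);
</script>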

Related

MediaRecorder Chrome; OpenTok audio not captured

Looking for assistance from anyone who might have some insight into this. I'm having a problem with the MediaRecorder API built into Chrome, specifically on macOS, a problem that does not appear at all on Windows.
I need to capture the desktop screen as well as the desktop audio, and currently I get the stream using:
navigator.mediaDevices.getDisplayMedia({
    video: {
        cursor: 'always',
        width: 1280,
        height: 720,
        frameRate: {
            ideal: 12,
            max: 15
        },
    },
    audio: true
})
This, of course, gives the prompt for the user to select a screen/application/tab.
Following that, I load the stream into a new MediaRecorder:
new MediaRecorder(screenCaptureStream, {mimeType: "video/webm"})
On Windows this all works fine: as long as the "Share Audio" checkbox on the bottom left is checked, audio will be heard in the resulting file.
On Mac, however, the only time a "Share Audio" option is offered is when a Chrome tab is selected. This is fine for something like a YouTube tab playing music; it captures both video and audio without a problem.
However, on a tab running an OpenTok video session, no audio coming from the video session is captured. At first I thought it may have been ffmpeg incorrectly processing it, but simply playing back the raw webm file gives no audio. Checking with MediaInfo, the file does indeed come back with an audio stream at "48.0 kHz, 32 bits, 1 channel, Opus".
Because this isn't a problem on Windows, I imagine that the OpenTok video session is somehow outputting audio in such a way that it bypasses the Chrome tab altogether, so that even if the user can hear the audio, the recorder doesn't capture it.
Does anyone know how to allow Chrome to record audio with the Application/Screen choices on macOS?
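For reference, the full flow looks roughly like this; the chunk handling and the onstop step are illustrative, not the exact production code:

// Rough sketch of the capture-and-record flow described above.
async function recordScreen() {
    const screenCaptureStream = await navigator.mediaDevices.getDisplayMedia({
        video: { cursor: 'always', width: 1280, height: 720, frameRate: { ideal: 12, max: 15 } },
        audio: true
    });

    // Sanity check: did the user actually share audio?
    console.log('audio tracks:', screenCaptureStream.getAudioTracks().length);

    const recorder = new MediaRecorder(screenCaptureStream, { mimeType: 'video/webm' });
    const chunks = [];
    recorder.ondataavailable = e => chunks.push(e.data);
    recorder.onstop = () => {
        const blob = new Blob(chunks, { type: 'video/webm' });
        // ...upload or save the blob here
    };
    recorder.start(1000); // gather a data chunk every second
    return recorder;
}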
EDIT: Upon more experimentation, I've discovered that the issue lies in the TokBox streams themselves.
Setup: using a second PC, I hooked my phone up to a line-in port to use as the "mic" for the TokBox stream, and started the session again. On that same page I have a YouTube embedded iframe linking to a random video, and in Firefox, another video entirely.
Selecting the entire screen and making sure "Share Audio" is checked, I verify that I can hear the audio being played from the phone on the other PC, and it's fine. The recording is going, and the checks for whether there is an audio track pass. I then swap to the next audio source by muting the 'phone' stream and start playing the YouTube embed. After a little bit, I stop it and play the Firefox video.
The result is that the embed and Firefox audio got captured, but NONE of the TokBox audio did. (This was done on Windows, so now I know it's not a Mac issue.)
Where is it being output?

HTML5 web audio seekTo for buffered source

I have a web application (similar to karaoke) where the user can record their voice over an instrumental.
After recording, the user plays back the recording. Here I play the instrumental in an <audio> tag and the voice using the Web Audio API. To sync both audio sources on play/pause I calculate time like this:
pausedAt = Date.now() - startedAt;   // on pause
startedAt = Date.now() - pausedAt;   // on resume
This works fine. The issue is when the user uses the slider on the audio tag to move forward/backward. I am thinking of a solution like this:
Use the ontimeupdate event, stop the voice, and then use startAt(currentTime), where currentTime is the current time of the instrumental playing in the audio tag.
Since there is no seekTo function in the API, I have to stop and then restart the audio. Is there any better solution for this?
The second issue I face is that seeking on the audio tag is not smooth. If I click arbitrarily on the progress bar, sometimes it doesn't work. When I looked at the network tab in the developer tools window I saw something like what is shown in the image: it sends out some 600 requests and downloads some 86 MB of data, whereas the file size is less than 10 MB.
You really should use the Web Audio API to do this. The <audio> element will never give you precise synchronization, and it will rely on streaming to seek, which is going to result in extra downloading, as you saw. Just load the song via XHR and decodeAudioData, and provide your own playback controls.
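A minimal sketch of that approach (using fetch in place of XHR; the file URL is a placeholder): decode the whole file up front, then "seek" by stopping the current source node and starting a new one at an offset.

// Load the instrumental once, then seek by restarting a source node at an offset.
var ctx = new AudioContext();
var buffer = null;
var source = null;
var startedAt = 0;  // ctx.currentTime when playback (re)started
var offset = 0;     // position in the buffer we started from

fetch('instrumental.mp3')
    .then(function (res) { return res.arrayBuffer(); })
    .then(function (data) { return ctx.decodeAudioData(data); })
    .then(function (decoded) { buffer = decoded; });

function playFrom(position) {
    if (source) { source.stop(); }      // no seekTo, so stop and restart
    source = ctx.createBufferSource();
    source.buffer = buffer;
    source.connect(ctx.destination);
    offset = position;
    startedAt = ctx.currentTime;
    source.start(0, offset);            // second argument is the start offset
}

function currentPosition() {
    return offset + (ctx.currentTime - startedAt);
}

Your own transport controls then call playFrom() for both the instrumental and the recorded voice, which keeps the two under your control instead of relying on the <audio> element's seeking.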

Why does dash.js fail to play this MPEG-DASH stream in Chrome?

This MPEG-DASH stream
http://54.241.9.147/new-fandor/vod/21/2157/dark_star_FILM_v11.smil/manifest_mpm4sav_mvlist.mpd
doesn't play in dash.js -- it plays the first segment at the lowest bitrate, switches to the next higher bitrate and stops after loading the second bitrate's init information. You can see this by pointing Chrome at the dash.js reference player, entering the stream URL in the top box and hitting Load. Open the JavaScript console to see that dash.js reported a media error, which means that the video element had a .error.
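For the record, you can surface that error yourself from the console; the selector below just grabs the reference player's <video> element and is purely illustrative:

var video = document.querySelector('video');
video.addEventListener('error', function () {
    // MediaError codes: 1 aborted, 2 network, 3 decode, 4 source not supported
    console.log('media error code:', video.error && video.error.code);
});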
The same player is able to play this stream in IE11 without error.
These streams, each of which contains only one of the bitrates that play in the above sequence, both play without error in Chrome, so it's not that the underlying media is just somehow corrupt.
http://54.241.9.147/new-fandor/vod/21/2157/dark_star_FILM_v11_0.smil/manifest_mpm4sav_mvlist.mpd
http://54.241.9.147/new-fandor/vod/21/2157/dark_star_FILM_v11_1.smil/manifest_mpm4sav_mvlist.mpd
Any ideas?
A Chromium developer says that this is due to audio sample rate switching, which Chrome doesn't support yet: https://code.google.com/p/chromium/issues/detail?id=315330. Although every bitrate of a given video should have had the same audio parameters, some of our videos had different audio bitrates for different video bitrates. The solution was to re-encode those videos correctly.

Embedding Media Players for streaming audio

I already know that there are Flash and HTML5 players that can effectively stream a Shoutcast source. I also know that I can simply add a link to the .pls and have it open. I know that I can't force a specific player to be used on a user's system. I've looked through various questions on this site and haven't actually found a solution.
I know that if I use a simple href link and point it to an .m3u file, I can pop open a window and it will load whatever audio player the user has set as their default. What I would like to do is provide a way for a person to click on a link for a specific type of player and have that open an embedded player of that type. If the user does not have that plug-in or player installed, it will offer the option to download and install it. I've figured out how to embed a Windows Media player and a QuickTime player, but I haven't yet figured out how to embed RealAudio or a "default" (Winamp, VLC) media player. I'm hoping someone has an idea on how to accomplish this. Thanks.
All you need to do is embed the right content, and the system will use whatever plugin has registered itself to handle that content type.
<embed src="somefile.ra" />
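For example, giving the playlist an explicit MIME type lets the browser hand it to whichever plugin is registered for that type; the stream URL here is only a placeholder:

<!-- Illustrative: the browser passes this to whatever plugin has
     registered itself for audio/x-scpls (the Shoutcast .pls playlist type). -->
<embed src="http://stream.example.com:8000/listen.pls" type="audio/x-scpls" width="300" height="60" />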

Is it possible to play HTTP Live video on a browser? 2012

I was wondering if it is possible to stream live HTTP video to the user. NOT a video file.
LIVE video. Using any technology, Flash or HTML5. I did some research, and most of the players I found support live streaming, but most of them mean YouTube-like streaming, where you can click further into the video and have it load. I am talking about LIVE video, happening at the same time the user watches it.
I have the incoming HTTP packets, stored in a file called "current_frame.h264". This file gets updated as the packets come in. It does NOT grow in size, it just gets updated (it stays at around 17-20 KB). Now I want to take this file and show it in a browser. Can anybody help me out?
Apple developed something like that a while ago: https://developer.apple.com/streaming/
All you need is a webserver and an HTML5-enabled webpage.
With this technology you also have the ability to serve different quality streams based on the connection (mobile usage, for example).
https://developer.apple.com/library/archive/documentation/NetworkingInternet/Conceptual/StreamingMediaGuide/UsingHTTPLiveStreaming/UsingHTTPLiveStreaming.html
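On a browser with native HLS support (Safari, iOS), the page side can then be as simple as pointing a <video> element at the .m3u8 playlist; the URL below is a placeholder:

<!-- Placeholder playlist URL; native .m3u8 playback is limited to browsers
     with built-in HLS support (e.g. Safari/iOS), others need a JavaScript HLS player. -->
<video src="http://example.com/live/stream.m3u8" controls autoplay width="640" height="360"></video>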