How can I watch my video from an SDP file? - html

I'm using ffmpeg to create a stream, and it works fine: I have a server, and with ffplay I can watch the stream.
My only (big) constraint is real time.
I have to embed it into an HTML page accessible from mobile devices.
I tried the HTML5 video tag, but I can't point it at an SDP file.
With ffmpeg I create a stream from my webcam. I have also created the SDP file, but it doesn't work in HTML5.
The code is here:
ffmpeg server:
sudo ffmpeg -re -f video4linux2 -i /dev/video0 -fflags nobuffer rtp://224.10.20.30:20000
file.sdp
v=0
o=- 0 0 IN IP4 127.0.0.1
s=No Name
c=IN IP4 224.10.20.30
t=0 0
a=tool:libavformat 55.7.100
m=video 20000 RTP/AVP 96
b=AS:200
a=rtpmap:96 MP4V-ES/90000
a=fmtp:96 profile-level-id=1
ffplay: (It works)
ffplay file.sdp
How can I view the stream in a browser?
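A browser cannot consume a raw RTP stream described by an SDP file through the HTML5 video tag, so the stream has to be repackaged into something HTTP-friendly first. As a rough sketch only (the output path, GOP size and segment settings below are assumptions, not taken from the question), ffmpeg could repackage the same webcam capture as HLS, which a plain web server can then deliver to an HTML5 player such as hls.js:
# Sketch only: repackage the webcam capture as HLS segments under the web root
# so an HTML5 player (e.g. hls.js) can fetch the playlist and segments over HTTP.
sudo ffmpeg -f video4linux2 -i /dev/video0 -c:v libx264 -preset veryfast -tune zerolatency -g 50 -f hls -hls_time 2 -hls_list_size 5 -hls_flags delete_segments /var/www/html/live/stream.m3u8
Keep in mind that HLS adds several seconds of latency, so if the real-time constraint is strict, a WebRTC-based approach (as in the related questions below) is usually the alternative.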

Related

WebRTC (using gstreamer and webrtcbin) works with VP9 but not with H264

I have a C++ application that gets the video from a camera in RTSP/H264 format using GStreamer and re-sends it using webrtcbin. I have followed the example from this link and I can see the video through Firefox (with the tips suggested in this post) when I use VP9 encoding.
The pipeline I have used is:
rtspsrc location=rtsp://192.168.1.162/z3-1.mp4 ! application/x-rtp,encoding-name=H264 ! rtph264depay ! nvv4l2decoder ! nvv4l2vp9enc ! video/x-vp9 ! rtpvp9pay ! application/x-rtp,media=video,clock-rate=90000,encoding-name=VP9,payload=96,framerate=25/1 ! webrtcbin async-handling=true name=sendrecv
Although I follow the suggestion from the post, I cannot see the video in Chrome; the statistics in chrome://webrtc-internals/ make it clear that the video data is arriving, but Chrome does not display it.
Independently of the browser, I have some issues streaming and viewing the video in 4K. Therefore, I have decided to drop the VP9 encoding (to speed up the processing) and re-send the data directly in H264 from the camera. To do that I use the pipeline:
rtspsrc location=rtsp://192.168.1.162/z3-1.mp4 ! application/x-rtp,encoding-name=H264 ! webrtcbin async-handling=true name=sendrecv
After using this pipeline, I do not see the video in either Firefox or Chrome. The interesting point is that if I analyze the traffic, Firefox appears not to be receiving any data at all, whereas Chrome receives megabytes of data but still does not show the video.
The answers of the negotiation are:
Firefox
v=0
o=mozilla...THIS_IS_SDPARTA-95.0.2 5069762040601189414 0 IN IP4 0.0.0.0
s=-
t=0 0
a=sendrecv
a=fingerprint:sha-256 ED:70:D8:AF:49:E9:B1:F8:47:83:1B:2B:13:D3:67:AD:F6:43:9D:36:59:8B:74:93:34:1D:AB:D5:67:1A:E4:07
a=ice-options:trickle
a=msid-semantic:WMS *
m=video 0 UDP/TLS/RTP/SAVPF 120
c=IN IP4 0.0.0.0
a=inactive
a=mid:video0
a=rtpmap:120 VP8/90000
Chrome
v=0
o=- 2541702691192041899 2 IN IP4 127.0.0.1
s=-
t=0 0
a=msid-semantic: WMS
m=video 9 UDP/TLS/RTP/SAVPF 96
c=IN IP4 0.0.0.0
a=rtcp:9 IN IP4 0.0.0.0
a=ice-ufrag:ssce
a=ice-pwd:eWXfMAvg/KEFxesG2nS3aNTt
a=ice-options:trickle
a=fingerprint:sha-256 E0:79:E1:50:F6:9F:CB:8B:80:8A:40:5A:B9:1B:35:27:EF:A2:45:EC:A1:A7:58:B5:24:98:0C:8D:B0:41:3B:1C
a=setup:active
a=mid:video0
a=recvonly
a=rtcp-mux
a=rtcp-rsize
a=rtpmap:96 H264/90000
a=rtcp-fb:96 nack pli
a=fmtp:96 level-asymmetry-allowed=1;packetization-mode=1;profile-level-id=64001f
Any idea why there is no video in Firefox or Chrome? Any tip would be really helpful. Many thanks!
One possible reason: some browsers support only baseline-profile encoded H264 streams.
You can try to fool the browser by adding something like capssetter caps="application/x-rtp,profile-level-id=(string)42c015" between rtph264pay and webrtcbin, but it will only help in some cases.
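As a concrete illustration of that suggestion, a pipeline string along the lines of the one in the question might look as follows. This is a sketch only: the depay/repay step, the pt and config-interval values, and the profile-level-id are assumptions, and webrtcbin still needs the application code from the question to drive signalling, so gst-launch-1.0 on its own will not complete a call:
# Sketch only: repay the camera's H264 and merge a baseline profile-level-id
# into the RTP caps just before webrtcbin, as suggested above.
gst-launch-1.0 rtspsrc location=rtsp://192.168.1.162/z3-1.mp4 ! rtph264depay ! h264parse ! rtph264pay pt=96 config-interval=-1 ! capssetter caps="application/x-rtp,profile-level-id=(string)42c015" ! webrtcbin async-handling=true name=sendrecv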

Asterisk gives "Strict RTP learning" message and no audio for Chrome WebRTC but works in Firefox

I've been experimenting with WebRTC with an Asterisk server (v13.18) on the same LAN as my computer. I configured the Asterisk extension 6003 to automatically answer and play a certain notorious sound file whenever it's dialed, then confirmed that this worked with the Ekiga softphone client.
I was then able to get this working as well in Firefox via the following steps:
Opened the online demo site https://www.doubango.org/sipml5/call.htm?svn=252
Opened the Expert Mode screen
Checked the "Disable Video" checkbox.
Entered [] for the "ICE servers" field (because I'm on a local LAN with no NAT involved, I don't need STUN or TURN, though I do have ICE enabled in my Asterisk config)
Entered my "Websocket Server URL" value of wss://asterisk-ci.test:8089/ws
Clicked Save and then returned to the demo page.
Entered the asterisk server info on the "Registration" section of the demo page and clicked "Login", confirming that it then displays a status of "Connected".
Entered the extension 6003 in the "Call Control" box and clicked "Call".
In Firefox this works great - the sound file is played back to me over the call.
In Google Chrome (latest v65) no sound actually plays but other than that everything seems like it should be working. In particular:
The sipML5 client displays "In Call" and the UI shows the call being active.
No errors in the Javascript console.
The SIP traffic in the Websocket Frames in the Network tab look good and seem to match what Firefox is doing.
The chrome://webrtc-internals page indicates a lot of traffic coming in. In particular, the graph of the data on the audio channel appears consistent with a sound file showing up here.
I tried setting up an example app using SIP.js and got the exact same results, confirming that this isn't an issue with sipML5 but is rather something about my Asterisk config and how it interacts with Google Chrome.
I connected to Asterisk via asterisk -vvvvvr to see what debug messages might show me, and there do appear to be some significant differences between the working Firefox and the nonworking Chrome. Here's what I see in Firefox when connecting and then making the call:
== WebSocket connection from '192.168.99.123:40190' for protocol 'sip' accepted using version '13'
-- Registered SIP '1061' at 192.168.99.123:40190
== DTLS ECDH initialized (automatic), faster PFS enabled
== Using SIP RTP CoS mark 5
> 0x7f79f800dba0 -- Strict RTP learning after remote address set to: 192.168.99.123:32807
-- Executing [6003#users:1] Answer("SIP/1061-00000007", "") in new stack
> 0x7f79f800dba0 -- Strict RTP learning after ICE completion
> 0x7f79f800dba0 -- Strict RTP switching to RTP target address 192.168.99.123:32807 as source
-- Executing [6003#users:2] Playback("SIP/1061-00000007", "auto-playback") in new stack
-- <SIP/1061-00000007> Playing 'auto-playback.slin' (language 'en')
> 0x7f79f800dba0 -- Strict RTP learning complete - Locking on source address 192.168.99.123:32807
-- Executing [6003#users:3] Hangup("SIP/1061-00000006", "") in new stack
But I get a very different result when connecting on Google Chrome:
== WebSocket connection from '192.168.99.123:52868' for protocol 'sip' accepted using version '13'
-- Registered SIP '1061' at 192.168.99.123:52868
== DTLS ECDH initialized (automatic), faster PFS enabled
== Using SIP RTP CoS mark 5
> 0x7f79fc00c710 -- Strict RTP learning after remote address set to: 127.0.0.1:9
-- Executing [6003#users:1] Answer("SIP/1061-00000008", "") in new stack
> 0x7f79fc00c710 -- Strict RTP learning after remote address set to: 192.168.99.123:39303
> 0x7f79fc00c710 -- Strict RTP learning after remote address set to: 192.168.99.123:39303
> 0x7f79fc00c710 -- Strict RTP learning after remote address set to: 192.168.99.123:39303
-- Executing [6003#users:2] Playback("SIP/1061-00000008", "auto-playback") in new stack
-- <SIP/1061-00000008> Playing 'auto-playback.slin' (language 'en')
> 0x7f79fc00c710 -- Strict RTP learning after remote address set to: 192.168.99.123:39303
> 0x7f79fc00c710 -- Strict RTP learning after remote address set to: 192.168.99.123:39303
The message 0x7f79fc00c710 -- Strict RTP learning after remote address set to: 192.168.99.123:39303 then repeats ad infinitum for the duration of the call.
In addition to that message repeating, I notice that on Firefox the original "Strict RTP learning" message has the correct address, whereas on Google Chrome it has 127.0.0.1:9. Both the 127.0.0.1 and the use of port 9 are interesting, though I'm not sure what to make of either. Does Google Chrome hide your IP address in a way that is messing with Asterisk?
Interestingly, when I try the same thing using a SIP.js example, I get exactly the same result (works on Firefox, connects but has no sound on Chrome) with the same debug output in Asterisk except that the initial address is 0.0.0.0:9 instead of 127.0.0.1:9.
Regardless I'm not sure what next steps to even try, so any help would be appreciated.
EDIT: As suggested, I'll post the SDP logs. Here's what I get for the working Firefox:
Local SDP (Offer)
v=0
o=mozilla...THIS_IS_SDPARTA-59.0.2 7697709853700369104 0 IN IP4 0.0.0.0
s=-
t=0 0
a=sendrecv
a=fingerprint:sha-256 BD:03:D7:1A:FB:F7:A3:BE:D0:F9:22:65:80:7B:FE:78:1C:17:01:17:99:57:A4:40:49:0D:EF:AA:AA:91:63:2C
a=group:BUNDLE sdparta_0
a=ice-options:trickle
a=msid-semantic:WMS *
m=audio 52547 UDP/TLS/RTP/SAVPF 109 9 0 8 101
c=IN IP4 192.168.99.123
a=candidate:0 1 UDP 2122252543 192.168.99.123 52547 typ host
a=candidate:1 1 TCP 2105524479 192.168.99.123 9 typ host tcptype active
a=candidate:0 2 UDP 2122252542 192.168.99.123 33797 typ host
a=candidate:1 2 TCP 2105524478 192.168.99.123 9 typ host tcptype active
a=sendrecv
a=end-of-candidates
a=extmap:1 urn:ietf:params:rtp-hdrext:ssrc-audio-level
a=extmap:2 urn:ietf:params:rtp-hdrext:sdes:mid
a=fmtp:109 maxplaybackrate=48000;stereo=1;useinbandfec=1
a=fmtp:101 0-15
a=ice-pwd:63350c71006d1daf78366efc8d05347f
a=ice-ufrag:e92ccf7b
a=mid:sdparta_0
a=msid:{8a0a921d-b591-41b5-94e7-647b9b40cd06} {78e4a3a8-628f-4e09-a05a-fa6edb3022be}
a=rtcp:33797 IN IP4 192.168.99.123
a=rtcp-mux
a=rtpmap:109 opus/48000/2
a=rtpmap:9 G722/8000/1
a=rtpmap:0 PCMU/8000
a=rtpmap:8 PCMA/8000
a=rtpmap:101 telephone-event/8000
a=setup:actpass
a=ssrc:1153204890 cname:{9de8930f-bf76-48e0-a9c9-4c15f6914409}
Remote SDP (Answer)
v=0
o=root 477460967 477460967 IN IP4 172.30.8.8
s=-
t=0 0
a=sendrecv
m=audio 18666 RTP/SAVPF 0 8 101
c=IN IP4 172.30.8.8
a=candidate:Hac1e0808 1 UDP 2130706431 172.30.8.8 18666 typ host
a=candidate:Hac1e0808 2 UDP 2130706430 172.30.8.8 18667 typ host
a=sendrecv
a=fingerprint:sha-256 75:D2:BE:77:B6:8E:1B:4E:F9:BF:FB:34:54:2D:05:31:F6:97:C5:34:F3:D9:65:BE:FC:C6:E4:5C:1A:5E:11:E7
a=fmtp:101 0-16
a=ice-pwd:1e0f3ac370cce57d7c978ecb57ae23d9
a=ice-ufrag:7189ea580175062f339a9fe84ed6ecae
a=maxptime:150
a=rtcp-mux
a=rtpmap:0 PCMU/8000
a=rtpmap:8 PCMA/8000
a=rtpmap:101 telephone-event/8000
a=setup:active
And here's what I see from the non-working Chrome, which is also SDP but looks so different that, for example, I don't even see my IP address anywhere in the output:
> createOfferOnSuccess
type: offer, sdp: v=0
o=- 3202047122122577027 2 IN IP4 127.0.0.1
s=-
t=0 0
a=group:BUNDLE audio
a=msid-semantic: WMS C0FThsSoaGKxFOoR8Fnptw8vJWdbuN4K2DeU
m=audio 9 UDP/TLS/RTP/SAVPF 111 103 104 9 0 8 106 105 13 110 112 113 126
c=IN IP4 0.0.0.0
a=rtcp:9 IN IP4 0.0.0.0
a=ice-ufrag:kQT+
a=ice-pwd:6BgMZ48o3m7PMPFXY7AZvfdb
a=ice-options:trickle
a=fingerprint:sha-256 59:9F:B3:53:89:64:3A:3F:03:1B:32:8F:97:9B:6E:A1:33:B8:05:DD:92:87:3C:1C:CA:A3:83:28:8D:2C:98:FE
a=setup:actpass
a=mid:audio
a=extmap:1 urn:ietf:params:rtp-hdrext:ssrc-audio-level
a=sendrecv
a=rtcp-mux
a=rtpmap:111 opus/48000/2
a=rtcp-fb:111 transport-cc
a=fmtp:111 minptime=10;useinbandfec=1
a=rtpmap:103 ISAC/16000
a=rtpmap:104 ISAC/32000
a=rtpmap:9 G722/8000
a=rtpmap:0 PCMU/8000
a=rtpmap:8 PCMA/8000
a=rtpmap:106 CN/32000
a=rtpmap:105 CN/16000
a=rtpmap:13 CN/8000
a=rtpmap:110 telephone-event/48000
a=rtpmap:112 telephone-event/32000
a=rtpmap:113 telephone-event/16000
a=rtpmap:126 telephone-event/8000
a=ssrc:3990625320 cname:R43Nh5Jptx9sDbOE
a=ssrc:3990625320 msid:C0FThsSoaGKxFOoR8Fnptw8vJWdbuN4K2DeU c42217e8-19c2-4d94-a392-d4166d00eb22
a=ssrc:3990625320 mslabel:C0FThsSoaGKxFOoR8Fnptw8vJWdbuN4K2DeU
a=ssrc:3990625320 label:c42217e8-19c2-4d94-a392-d4166d00eb22
> setLocalDescription
type: offer, sdp: v=0
o=- 3202047122122577027 2 IN IP4 127.0.0.1
s=-
t=0 0
a=group:BUNDLE audio
a=msid-semantic: WMS C0FThsSoaGKxFOoR8Fnptw8vJWdbuN4K2DeU
m=audio 9 UDP/TLS/RTP/SAVPF 111 103 104 9 0 8 106 105 13 110 112 113 126
c=IN IP4 0.0.0.0
a=rtcp:9 IN IP4 0.0.0.0
a=ice-ufrag:kQT+
a=ice-pwd:6BgMZ48o3m7PMPFXY7AZvfdb
a=ice-options:trickle
a=fingerprint:sha-256 59:9F:B3:53:89:64:3A:3F:03:1B:32:8F:97:9B:6E:A1:33:B8:05:DD:92:87:3C:1C:CA:A3:83:28:8D:2C:98:FE
a=setup:actpass
a=mid:audio
a=extmap:1 urn:ietf:params:rtp-hdrext:ssrc-audio-level
a=sendrecv
a=rtcp-mux
a=rtpmap:111 opus/48000/2
a=rtcp-fb:111 transport-cc
a=fmtp:111 minptime=10;useinbandfec=1
a=rtpmap:103 ISAC/16000
a=rtpmap:104 ISAC/32000
a=rtpmap:9 G722/8000
a=rtpmap:0 PCMU/8000
a=rtpmap:8 PCMA/8000
a=rtpmap:106 CN/32000
a=rtpmap:105 CN/16000
a=rtpmap:13 CN/8000
a=rtpmap:110 telephone-event/48000
a=rtpmap:112 telephone-event/32000
a=rtpmap:113 telephone-event/16000
a=rtpmap:126 telephone-event/8000
a=ssrc:3990625320 cname:R43Nh5Jptx9sDbOE
a=ssrc:3990625320 msid:C0FThsSoaGKxFOoR8Fnptw8vJWdbuN4K2DeU c42217e8-19c2-4d94-a392-d4166d00eb22
a=ssrc:3990625320 mslabel:C0FThsSoaGKxFOoR8Fnptw8vJWdbuN4K2DeU
a=ssrc:3990625320 label:c42217e8-19c2-4d94-a392-d4166d00eb22
> setRemoteDescription
type: answer, sdp: v=0
o=root 2070370846 2070370846 IN IP4 172.30.8.8
s=Asterisk PBX certified/13.18-cert2
c=IN IP4 172.30.8.8
t=0 0
m=audio 13528 RTP/SAVPF 0 8 126
a=rtpmap:0 PCMU/8000
a=rtpmap:8 PCMA/8000
a=rtpmap:126 telephone-event/8000
a=fmtp:126 0-16
a=maxptime:150
a=ice-ufrag:4c551f814a951c5e6f74e5c225c5e160
a=ice-pwd:7877be1235781d443361467a70b33c12
a=candidate:Hac1e0808 1 UDP 2130706431 172.30.8.8 13528 typ host
a=candidate:Hac1e0808 2 UDP 2130706430 172.30.8.8 13529 typ host
a=connection:new
a=setup:active
a=fingerprint:SHA-256 75:D2:BE:77:B6:8E:1B:4E:F9:BF:FB:34:54:2D:05:31:F6:97:C5:34:F3:D9:65:BE:FC:C6:E4:5C:1A:5E:11:E7
a=rtcp-mux
a=sendrecv
And in case it helps, here's the complete set of events logged when setting up the call and playing it:
addStream
createOffer
negotiationneeded
createOfferOnSuccess
setLocalDescription
signalingstatechange
setLocalDescriptionOnSuccess
icegatheringstatechange
icegatheringstatechange
setRemoteDescription
signalingstatechange
iceconnectionstatechange
onAddStream
setRemoteDescriptionOnSuccess
FURTHER EDIT: After reviewing some SDP docs and looking through my own SDP logs above, the main thing I see that differs and probably accounts for Firefox working and Chrome not is that Firefox has the line
c=IN IP4 192.168.99.123
which is indeed my IP address, whereas Chrome has the line
c=IN IP4 0.0.0.0
I tried running Chrome from a terminal to capture any debug output that gets printed to the screen apart from what I see in chrome://webrtc-internals and I found that this message is displayed many times per second:
ERROR:dtlstransport.cc(557)] Jingle:DtlsTransport[audio|1|__]: Received non-DTLS packet before DTLS complete.
I've read through a number of Google search results for that error but haven't been able to come up with anything to try to fix it. However, it seems possibly related; if one or more UDP packets went to the wrong place, then even if most of the audio packets were properly sent, they'd never get decoded and we'd see a lot of data coming in but no audio actually being played. Which is indeed what I'm seeing.
I'll do some more digging to see what settings I can tweak to make Chrome send the same sort of information that Firefox is sending, or to have Asterisk do the correct thing for both of them. In the meantime, I'm opening a Bounty on this question, since any additional help and suggestions will be much appreciated.
Clear your Chrome cache, specifically cookies and cached files:
Go to chrome://net-internals/#dns and click "Clear host cache".
Also check whether DNS prefetching is disabled at chrome://dns (if DNS prefetching is not disabled, you will see tables there).
Then restart Chrome.
The only difference I observed is that "Strict RTP learning complete" happens for Firefox but never for Chrome.
Also, there are no candidates in the Chrome offer, so the issue may be in the ICE candidate negotiation. Without a network sniffer, identifying the issue is a bit difficult.
Read SDP basics here: https://webrtchacks.com/sdp-anatomy/
Try playing with the strictrtp mode and with:
ice_support=yes
rtcp_mux=yes
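Since identifying this without a network sniffer is difficult, one quick check (a sketch only; the interface choice is an assumption and the address is taken from the logs above) is to capture UDP traffic on the Asterisk host and see whether RTP from Chrome ever arrives at the address Asterisk locked onto:
# Sketch only: show UDP packets exchanged with the browser machine so you can
# compare where Chrome actually sends RTP with what Asterisk reports.
sudo tcpdump -n -i any udp and host 192.168.99.123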
It could be that Chrome is ignoring your local hosts file (on Ubuntu this is /etc/hosts), so your local "asterisk-ci.test" domain is not being resolved to "192.168.99.123".
Try clearing Chrome's host cache:
Go to chrome://net-internals/#dns in your Chrome browser
Press "Clear host cache"
To test this, try entering a "Websocket Server URL" value of wss://192.168.99.123:8089/ws, so that you are using the local IP directly.
References
Localhost not working in Chrome, 127.0.0.1 does work
Why is Chromium bypassing /etc/hosts and dnsmasq
Edit: Clear the socket pools also by visiting "chrome://net-internals/#sockets" in chrome and clicking "Flush Socket Pools".
how to clear dns cache in chrome
You can also try turning OFF "Protect you and your device from dangerous sites" in Chrome's Advanced Preferences (answer on the Super User Stack Exchange).

Realtime streaming video to HTML5 with RaspberryPi, gstreamer

I'm trying to make a live stream from a Raspberry Pi camera available on an HTML5 webpage. Because of a combination of factors, I would like to stream it to an external server PC (the server runs Windows 7), and this server should supply the stream to the HTML page.
I'm able to get the stream from the Raspberry Pi and stream it with GStreamer to the external server like this:
Raspberry Pi:
raspivid -n -t 0 -rot 270 -w 960 -h 720 -fps 30 -b 2000000 -o - | gst-launch-1.0 -e -vvvv fdsrc ! h264parse ! rtph264pay pt=96 config-interval=1 ! udpsink host=<external IP> port=5000
External server
gst-launch-1.0 -e -v udpsrc port=5000 ! application/x-rtp, payload=96 ! rtpjitterbuffer ! rtph264depay ! avdec_h264 ! fpsdisplaysink sync=false text-overlay=false
As a result, I can display the live video stream through GStreamer (the D3D video sink) on the external server PC.
Now I have a problem:
I want to display this as HTML5 video served by Apache on the server PC instead of the GStreamer D3D video output.
I searched for a solution for a long time but couldn't find anything.
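One direction that fits the pipeline above (a sketch only: hlssink comes from gst-plugins-bad, and the Apache htdocs paths are assumptions) is to replace the D3D sink on the server with an HLS sink that writes segments into Apache's document root, which an HTML5 player such as hls.js can then fetch:
# Sketch only: mux the depayed H264 into MPEG-TS segments plus a playlist
# inside the Apache document root instead of rendering it with fpsdisplaysink.
gst-launch-1.0 -e -v udpsrc port=5000 ! application/x-rtp, payload=96 ! rtpjitterbuffer ! rtph264depay ! h264parse ! mpegtsmux ! hlssink target-duration=2 max-files=10 location=C:/Apache24/htdocs/live/segment%05d.ts playlist-location=C:/Apache24/htdocs/live/playlist.m3u8
Note that, like any HLS setup, this adds several seconds of latency compared to the direct GStreamer sink.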

How to setup HLS and DVR functionality and read it?

I want to create a live audio stream with DVR functionality.
In my player I want to listen live or seek back into the past stream (a few minutes ago).
I use nginx to serve the HLS stream.
How do I set up the DVR functionality? Do I use a specific nginx module for the live and past streams, with a parameter like past.m3u8?seek=timestamp?
I'm also looking for an HTML5 web player, with a Flash fallback, that can play the live stream and seek the past.m3u8 stream.
I've setup nginx with this config:
location /hls {
# Serve HLS fragments
types {
application/vnd.apple.mpegurl m3u8;
video/mp2t ts;
}
root /tmp;
add_header Cache-Control no-cache;
}
I create the HLS stream with avconv:
avconv -i [input] -vn -sn -c:a libfdk_aac -b:a 64k -hls_time 10 /path/to/hls/playlist.m3u8
To play this stream I use the clipr player, which allows seeking within the stream.
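For the DVR part specifically, one hedged variation of the command above is to keep every segment in the playlist instead of only the most recent ones, so a player can seek back into the live stream; hls_list_size is a standard option of ffmpeg's hls muxer and, as far as I can tell, of avconv's as well:
# Sketch only: hls_list_size 0 keeps all segments listed in playlist.m3u8,
# giving a growing, seekable window over the live audio stream.
avconv -i [input] -vn -sn -c:a libfdk_aac -b:a 64k -hls_time 10 -hls_list_size 0 /path/to/hls/playlist.m3u8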

FFmpeg Converted files not working on firefox

When I convert a file with ffmpeg and play the video in Firefox I get this error:
"VIDEOJS:" "ERROR:" "(CODE:3 MEDIA_ERR_DECODE)" "The video playback was aborted due to a corruption problem or because the video used features your browser did not support." Object { code: 3, message: "The video playback was aborted due to a corruption problem or because the video used features your browser did not support." }
In other browsers it works perfectly.
This is my ffmpeg convert command:
ffmpeg -i {input} -b 5500k -minrate 5500k -maxrate 5500k -bufsize 5500k -ab 384k -vcodec libx264 -acodec aac -strict -2 -ac 2 -ar 96000 -s 1280x720 -y {output}
Can someone tell me why the videos won't play in Firefox?
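One detail that stands out (an observation, not a confirmed fix): the command requests a 96 kHz AAC sample rate with -ar 96000, and browser audio decoders generally expect 44.1 or 48 kHz, which would fit a decode error that only Firefox reports. A hedged variant to try:
# Sketch only: same conversion, but with a conventional 48 kHz sample rate and
# the explicit -b:v form of the video bitrate option.
ffmpeg -i {input} -b:v 5500k -minrate 5500k -maxrate 5500k -bufsize 5500k -ab 384k -vcodec libx264 -acodec aac -strict -2 -ac 2 -ar 48000 -s 1280x720 -y {output}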