The GUI version of Linphone is configured with the H.264 codec and works well. But linphonec (the command-line version), which is in the same folder as Linphone, responds to the command "vcodec list" with only "0: VP8 (90000) enabled" and does not work with H.264. How do I configure linphonec to work with H.264? Thank you!
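In case it helps, this is how I'd check whether an H.264 plugin is even visible to linphonec; the plugin file names and directory below are guesses on my part (they vary by platform and Linphone version), and only "vcodec list" is taken from the actual linphonec commands:
ls /usr/lib/mediastreamer/plugins/   # look for an H.264 plugin such as libmsopenh264.so or libmsx264.so (assumed names/path)
linphonec
vcodec list                          # inside linphonec; H264 should appear here once the plugin loads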
I wanted to try GStreamer to connect to a remote IP camera using an RTSP stream.
So I've downloaded and installed the latest version.
The Using GStreamer page says: "GStreamer also provides playbin, a basic media-playback plugin that automatically takes care of most playback details."
So I've tried to connect to the camera using the following command:
gst-launch-1.0 playbin uri=rtsp://192.168.1.2:554/av0_0
Unfortunately I get an error:
ERROR: from element
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: Your GStreamer
installation is missing a plug-in. Additional debug info:
../gst/playback/gsturidecodebin.c(988): no_more_pads_full ():
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: no suitable
plugins found: ../gst/playback/gstdecodebin2.c(4679):
gst_decode_bin_expose ():
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstDecodeBin:decodebin0:
no suitable plugins found: Missing decoder: H.264 (Main Profile)
(video/x-h264, stream-format=(string)byte-stream,
alignment=(string)au, level=(string)3.1, profile=(string)main,
width=(int)1280, height=(int)720, framerate=(fraction)0/1,
interlace-mode=(string)progressive, chroma-format=(string)4:2:0,
bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8,
parsed=(boolean)true)
Does this mean that my installation is missing a decoder for H.264? How could that be? I thought H.264 was the most popular codec. Did I do something wrong?
Update:
./gst-inspect-1.0 | grep h264
videoparsersbad: h264parse: H.264 parser
typefindfunctions: video/x-h264: h264, x264, 264
rtp: rtph264pay: RTP H264 payloader
rtp: rtph264depay: RTP H264 depayloader
openh264: openh264enc: OpenH264 video encoder
openh264: openh264dec: OpenH264 video decoder
Have you installed the GStreamer MSI with the "Complete" option? If it was installed in Typical mode, the libav plugins won't be installed. You can modify the installation and select the additional feature, or do a reinstall.
When you install in Complete mode, you should be able to find avdec_h264. Alternatively, you can select Custom mode and enable the "GStreamer 1.0 libav wrapper".
$ gst-inspect-1.0.exe libav | grep h264
avdec_h264: libav H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10 decoder
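Once avdec_h264 is available, you can also try an explicit pipeline instead of playbin. This is only a sketch using the camera URI from your question; the element chain may need adjusting for your camera:
gst-launch-1.0 rtspsrc location=rtsp://192.168.1.2:554/av0_0 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink
If that plays, the problem was indeed just the missing H.264 decoder.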
I downloaded the as3-websocket-server library for a desktop AIR app and it works fine in Chrome.
When I use Firefox it is OK for HTTP pages, but I get an error message when I test it on HTTPS pages:
SecurityError: The operation is insecure.
I read that I should use wss:// rather than ws://, but the AIR server doesn't work with this protocol.
I tried to convert all "Socket" references to "SecureSocket" in the AIR app, but I get an error:
1118: Implicit coercion of a value with static type flash.net:Socket
to a possibly unrelated type flash.net:SecureSocket
Any idea on how to have the server available for both protocols? On the client side I could just check if I'm on HTTP or HTTPS and call the right websocket.
Thanks
I want to create a live audio stream with DVR functionality.
In my player I want to listen live or seek back into the past stream (from a few minutes ago).
I use nginx to serve the HLS stream.
How do I set up the DVR functionality? Do I use a specific nginx module for the live stream and the past stream, with a parameter like past.m3u8?seek=timestamp?
I'm also looking for an HTML5 web player, with a Flash fallback, that can play the live stream and seek within the past.m3u8 stream.
I've set up nginx with this config:
location /hls {
# Serve HLS fragments
types {
application/vnd.apple.mpegurl m3u8;
video/mp2t ts;
}
root /tmp;
add_header Cache-Control no-cache;
}
I create the HLS stream with avconv:
avconv -i [input] -vn -sn -c:a libfdk_aac -b:a 64k -hls_time 10 /path/to/hls/playlist.m3u8
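One thing I'm considering for the DVR part (assuming avconv's HLS muxer supports the hls_list_size option, as ffmpeg's does) is to keep every segment in the playlist instead of letting old ones expire:
avconv -i [input] -vn -sn -c:a libfdk_aac -b:a 64k -hls_time 10 -hls_list_size 0 /path/to/hls/playlist.m3u8
With hls_list_size 0 the playlist is never truncated, so nginx keeps serving all past segments and the player can seek back from the live edge.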
To play this stream I use the clipr player, which allows seeking within the stream.
I would like to test a Chrome App with the help of WebDriver.
Normally, when you want to start an app directly, you pass the --app-id argument, e.g.:
/opt/google/chrome/google-chrome --app-id=olddgoefnehjeogjhpcpolcnnifnglkp
But I'm not able to start my Chrome App with chromedriver.
I tried:
Selenium::WebDriver.for :chrome, :args => ["--app-id=olddgoefnehjeogjhpcpolcnnifnglkp"]
Steps to reproduce:
Go to the Chrome Web Store
Install an App
Go to preferences --> Extensions
Enable Developer mode
Now you can see the app ID, e.g.: ID: olddgoefnehjeogjhpcpolcnnifnglkp
Try to start WebDriver with this option:
driver = Selenium::WebDriver.for :chrome, :args => ["--app-id=olddgoefnehjeogjhpcpolcnnifnglkp"]
Here is a list of arguments that should be valid for chromedriver:
http://peter.sh/experiments/chromium-command-line-switches/
Thanks for your help!
I am a newbie to webrtc2sip. I have set up my webrtc2sip gateway and registered with sip2sip.info as my domain. The problem is that when I make video calls from Chrome to any SIP client (Ekiga/Jitsi), the call gets connected but I am unable to see video on either side.
==================================================================================
Case 1: Chrome calls SIP client
Result: No video shown on either the transmit or the receive side
==================================================================================
The Chrome JS console says:
State machine: tsip_dialog_register_InProgress_2_Connected_X_2xx SIPml-api.js?svn=179:1
==session event = m_stream_video_local_added SIPml-api.js?svn=179:1
==session event = m_stream_video_remote_added SIPml-api.js?svn=179:1
==session event = m_stream_audio_local_added SIPml-api.js?svn=179:1
==session event = m_stream_audio_remote_added SIPml-api.js?svn=179:1
I have attached the JS console logs (case1_web2SIPClient_JSLogs.txt), Wireshark trace (case1_web2SIPClient_WStrace.pcap), webrtc2sip gateway console logs (case1_web2SIPClient_gatewayLogs.txt), sipml5 expert settings (Expert_settings.png), and config.xml (config.xml) for this case. I did not change anything in the config.xml that was generated after I built the source, as mentioned in the instructions on this page (http://linux.autostatic.com/installing-webrtc2sip-on-ubuntu-1204).
I also tried making calls between Chrome and an Android SIP client (CSipSimple), and the problem remains the same.
==================================================================================
Case 2: SIP client calling Chrome
Result: as soon as I click the answer button in Chrome, the call gets rejected.
==================================================================================
The JS console logs state:
State machine: tsip_transac_ist_Proceeding_2_Completed_X_300_to_699 SIPml-api.js?svn=179:1
SEND: SIP/2.0 603 Failed to get local SDP
Via: SIP/2.0/WS 172.21.128.118:10060;rport=10060;branch=z9hG4bK-1441398960
From: <sip:tata@172.21.229.127>;tag=300647977
To: <sip:amshyam320@sip2sip.info>;tag=ZxQFfM7fIIP3rT1HINzb
Call-ID: fbdf5a11-ff9e-0072-fa8b-09525220cec6
CSeq: 1670757835 INVITE
Content-Length: 0
Reason: SIP; cause=603; text="Failed to get local SDP"
For this case I am attaching the JS logs (case2_SIPClient2WebJSLogs.txt) and the Wireshark dump (case2_jitsiToWeb_WStrace.pcap).
Configuration:
Chrome version: checked on 30.0.1599.114 and also on the latest Chrome version
webrtc2sip version: 2.6.0
sipml5 version: svn=203
Ubuntu version: 12.04 (checked on both desktop and server editions)
Am I missing something in my setup or configuration? Please guide me and help me move further.
Thanks,
Shyam
Case 2:
You're using an RTCWeb-capable browser (Chrome) and trying to call a SIP client which may not implement some mandatory features like ICE and SRTP. Chrome uses SRTP-SDES and Firefox uses SRTP-DTLS.
Enable the RTCWeb Breaker in the sipml5 expert settings and check again.
The RTCWeb Breaker is used to enable audio and video transcoding when the endpoints do not support the same codecs or the remote server is not RTCWeb-compliant.
Case 1:
Is audio working? Also, I can't see your logs.