I downloaded the as3-websocket-server library for a desktop AIR app, and it works fine with Chrome.
When I use Firefox it is OK for HTTP pages, but I get an error message if I test it on HTTPS pages:
"SecurityError: The operation is insecure."
I read that I should use wss:// rather than ws://, but the AIR server doesn't work with this protocol.
I tried to convert every "Socket" to "SecureSocket" in the AIR app, but I get an error:
1118: Implicit coercion of a value with static type flash.net:Socket
to a possibly unrelated type flash.net:SecureSocket
Any idea how to make the server available for both protocols? On the client side I could just check whether the page is on HTTP or HTTPS and open the right WebSocket, as sketched below.
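A minimal sketch of that client-side check (the host and port are placeholders for wherever the AIR server is listening):
var scheme = window.location.protocol === 'https:' ? 'wss' : 'ws'; // pick wss on HTTPS pages, ws otherwise
var socket = new WebSocket(scheme + '://localhost:8080'); // hypothetical address of the AIR WebSocket server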
Thanks
I followed a tutorial on how to make a multiplayer Tetris game; here is the repo:
https://github.com/Leftier/tetris
It worked just fine on localhost, so I tried to deploy it to Heroku (https://tetrixtest.herokuapp.com/ -- A/S/D to move, Q/E to rotate), but I get the following error:
WebSocket connection to 'wss://tetrixtest.herokuapp.com/' failed: Error during WebSocket handshake: Unexpected response code: 200
while trying to create the WebSocket on this line (connection-manager.js line 14):
this.conn = new WebSocket(`wss://${window.location.hostname}:${window.location.port}`)
I don't know much about WebSockets.
At first I thought that Heroku was not able to handle WebSockets, but that wasn't the case, so I tried using the URL directly as an argument instead of reading it from the browser, but I still get the same issue.
I would like some clues/hints about why this happens. I searched Google and GitHub, but I only found issues related to socket.io.
For me the solution was to turn on "Session affinity" by running this command: heroku features:enable http-session-affinity
More info at https://devcenter.heroku.com/articles/session-affinity
Session affinity, sometimes referred to as sticky sessions, is a
platform feature that associates all HTTP requests coming from an
end-user with a single application instance (web dyno).
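Assuming the Heroku CLI is installed and you run this from the app's directory, enabling and then confirming the feature looks roughly like this:
heroku features:enable http-session-affinity
heroku features    # http-session-affinity should now be listed as enabled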
I'm using Chrome's native messaging API to connect to a native host I'm developing in Go with the Cobra library. The native application has a standalone CLI (implemented with Cobra), and the bare command (without any arguments) starts listening for JSON via stdin, which is meant to be an API for Chrome to interact with.
However, it fails every time the extension makes a request to the native messaging host (the client just immediately disconnects from the process). When I start Chrome with the --enable-logging flag I can see that the native host is erroring with unknown command "chrome-extension://cnjopnegooahjdngnkhiokognkdjiioc/" for "--native-app-name--". This is Cobra's error message meaning that "chrome-extension://cnjopnegooahjdngnkhiokognkdjiioc/" is being passed as an argument, which seems to mean that Chrome is invoking the native host as app-name chrome-extension://cnjopnegooahjdngnkhiokognkdjiioc/ instead of just app-name.
Here's the code I'm using from the extension to call the native host:
var port = chrome.runtime.connectNative('app-name');
port.onMessage.addListener(function(msg) {
  console.log(msg);
});
port.onDisconnect.addListener(function() {
  console.log("disconnected");
});
port.postMessage({cmd:"ping"});
I can't find any documentation that suggests that Chrome sends the extension address as an argument, or whether it can be prevented.
It's part of the protocol and can't be disabled. The command line on Windows is something like this:
C:\Windows\system32\cmd.exe /c YOURHOSTAPP.exe chrome-extension://.................../
--parent-window=6752474 < \\.\pipe\chrome.nativeMessaging.in.e11ed8be274e1a85
> \\.\pipe\chrome.nativeMessaging.out.e11ed8be274e1a85
The first argument to the native messaging host is the origin of the caller, usually chrome-extension://[ID of whitelisted extension]. This allows native messaging hosts to identify the source of the message when multiple extensions are specified in the allowed_origins key in the native messaging host manifest.
On Windows, the native messaging host is also passed a command line argument with a handle to the calling chrome native window: --parent-window=<decimal handle value>. This lets the native messaging host create native UI windows that are correctly focused.
Warning: On Windows, in Chrome 54 and earlier, the origin was passed as the second parameter instead of the first parameter.
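For reference, the allowed_origins key mentioned above is set in the native messaging host manifest. A minimal sketch, with the host name and path as placeholders:
{
  "name": "app-name",
  "description": "Native messaging host",
  "path": "C:\\path\\to\\native-host.exe",
  "type": "stdio",
  "allowed_origins": [
    "chrome-extension://cnjopnegooahjdngnkhiokognkdjiioc/"
  ]
}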
We are having problems exporting the CEP definition file from the authoring tool to an (external) repository.
In the response preview from the developer tools of the browser we get the following error message:
"HTTP Status 500 - A javax.ws.rs.ext.MessageBodyReader implementation was not found for class org.apache.wink.json4j.JSONArray type and text/html;charset=utf-8 media type. Verify that all entity providers are correctly registered. Add a custom javax.ws.rs.ext.MessageBodyReader provider to handle the type and media type if a JAX-RS entity provider does not currently exist."
How can we make sure we are able to export to an external repository?
Your problem is that the external repository is not available.
Since you didn't mention it, my guess is that you are using the default external repository, which is http://localhost:8080/ProtonOnWebServerAdmin/resources/definitions,
but don't have a running instance of ProtonOnWebServerAdmin. You have to have ProtonOnWebServerAdmin running on a Tomcat server on your local machine for it to actually process the request.
If you're using anything else, make sure that repository knows how to handle the request.
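Assuming the default URL above, a quick way to check that the repository is actually reachable is to request it directly:
curl -i http://localhost:8080/ProtonOnWebServerAdmin/resources/definitions
If the connection is refused, ProtonOnWebServerAdmin is not running on Tomcat.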
I am trying to write a use-anywhere (web, AIR, mobile) OAuth library for AS3 that is flexible enough to use with any OAuth or near-OAuth site.
The sample app I am writing authenticates with Google, and I want to build an app that uses Google Drive.
At the moment the AIR and mobile apps work fine, but the web Flash Player app keeps giving me this error:
Error #2044: Unhandled securityError:. text=Error #2048: Security sandbox violation: http://localhost:81/OAuthWebExample.swf cannot load data from https://accounts.google.com/o/oauth2/token.
(I get the same error when on a non-localhost domain on port 80)
I have looked at https://accounts.google.com/crossdomain.xml which has:
<site-control permitted-cross-domain-policies="by-content-type" />
I am not sure what that means...
I am pretty sure that it is possible to get flash to talk to these google APIs. What can I do to get this to work?
(I'm not interested in the workaround where you use FeedBurner or something similar to proxy these calls, thanks.)
I have looked at https://accounts.google.com/crossdomain.xml
It's the master policy file, and it doesn't grant permissions to content on the accounts.google.com domain (there are no allow-access-from nodes within it), so Flash Player fires a securityError.
I am not sure what that means...
It means:
by-content-type: [HTTP/HTTPS only] Only policy files served with
Content-Type: text/x-cross-domain-policy are allowed
So it seems it's designed for sub-domain services and child crossdomain.xml files, and you can't load data directly from accounts.google.com. I found the same issue with Google OAuth for Flash (Google Oauth crossdomain.xml problem with Flex); they had to use the old AuthSub (which uses accounts.googleapis.com, which serves a proper crossdomain.xml) to solve the auth problem, and it seems nothing has changed in the past two years.
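For comparison, a crossdomain.xml that did grant Flash access would contain allow-access-from nodes, something like this (hypothetical; it is not what Google serves):
<?xml version="1.0"?>
<cross-domain-policy>
    <site-control permitted-cross-domain-policies="master-only" />
    <allow-access-from domain="*" secure="true" />
    <allow-http-request-headers-from domain="*" headers="*" />
</cross-domain-policy>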
Check which Flash Player security sandbox you compile for: local or network.
AIR applications can connect to both external and local files, but an embedded SWF can't do both.
https://www.adobe.com/security/flashplayer/articles/localcontent/
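If you compile with mxmlc, the sandbox for locally loaded content is controlled by the use-network flag (the source file name here is a placeholder): true selects local-with-networking, false selects local-with-filesystem.
mxmlc -use-network=true OAuthWebExample.as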
I am a newbie to webrtc2sip. I have set up my webrtc2sip gateway and registered with sip2sip.info as my domain. The problem is that when I make video calls from Chrome to any SIP client (Ekiga/Jitsi), the call gets connected but I am unable to see video on either side.
==================================================================================
Case 1: Chrome calls SIP client
Result: No video shown on both transmit and receive side
==================================================================================
The Chrome JS console says:
State machine: tsip_dialog_register_InProgress_2_Connected_X_2xx SIPml-api.js?svn=179:1
==session event = m_stream_video_local_added SIPml-api.js?svn=179:1
==session event = m_stream_video_remote_added SIPml-api.js?svn=179:1
==session event = m_stream_audio_local_added SIPml-api.js?svn=179:1
==session event = m_stream_audio_remote_added SIPml-api.js?svn=179:1
I have attached the JS console logs (case1_web2SIPClient_JSLogs.txt), Wireshark trace (case1_web2SIPClient_WStrace.pcap), webrtc2sip gateway console logs (case1_web2SIPClient_gatewayLogs.txt), sipml5 expert settings (Expert_settings.png), and config.xml (config.xml) for this case. I did not change anything in the config.xml that was generated after I built the source as described in the instructions on this page (http://linux.autostatic.com/installing-webrtc2sip-on-ubuntu-1204).
I also tried making calls between Chrome and an Android SIP client (CSipSimple), and the problem remains the same.
==================================================================================
Case 2: SIP client calls Chrome
Result: as soon as I click the answer button in Chrome, the call gets rejected.
==================================================================================
The JS console logs state:
State machine: tsip_transac_ist_Proceeding_2_Completed_X_300_to_699 SIPml-api.js?svn=179:1
SEND: SIP/2.0 603 Failed to get local SDP
Via: SIP/2.0/WS 172.21.128.118:10060;rport=10060;branch=z9hG4bK-1441398960
From: <sip:tata@172.21.229.127>;tag=300647977
To: <sip:amshyam320@sip2sip.info>;tag=ZxQFfM7fIIP3rT1HINzb
Call-ID: fbdf5a11-ff9e-0072-fa8b-09525220cec6
CSeq: 1670757835 INVITE
Content-Length: 0
Reason: SIP; cause=603; text="Failed to get local SDP"
For this case I am attaching the JS logs (case2_SIPClient2WebJSLogs.txt) and the Wireshark dump (case2_jitsiToWeb_WStrace.pcap).
Configuration:
Chrome version: checked on 30.0.1599.114 and on the latest Chrome version
webrtc2sip version: 2.6.0
sipml5 version: svn=203
Ubuntu version: 12.04 (checked on both desktop and server editions)
Am I missing something in my setup or configuration? Please guide me and help me move forward.
Thanks,
Shyam
Case 2:
You're using an RTCWeb-capable browser (Chrome) and trying to call a SIP client which may not implement some mandatory features like ICE and SRTP. Chrome uses SRTP-SDES and Firefox uses SRTP-DTLS.
Enable RTCWeb Breaker in sipml5 expert settings and check.
The RTCWeb Breaker is used to enable audio and video transcoding when the endpoints do not support the same codecs or the remote server is not RTCWeb-compliant.
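If you want to set it in code rather than through the expert settings page, the sipml5 stack configuration exposes a flag for this. A minimal sketch with placeholder credentials and gateway address:
var stack = new SIPml.Stack({
    realm: 'sip2sip.info',
    impi: 'amshyam320',
    impu: 'sip:amshyam320@sip2sip.info',
    password: '********', // placeholder
    websocket_proxy_url: 'ws://192.168.0.1:10060', // hypothetical webrtc2sip gateway address
    enable_rtcweb_breaker: true // let the gateway transcode for non-RTCWeb SIP clients
});
stack.start();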
Case 1:
Is audio working? And I can't see your logs.