AudioSessionAddPropertyListener callback function called for no reason

To keep an eye on whether earphones are plugged in or unplugged in my iPhone app, and react accordingly, I use the following kind of code in a few of my classes:
- (void)viewDidLoad
{
    [super viewDidLoad];
    …..
    routeChangeID = kAudioSessionProperty_AudioRouteChange;
    AudioSessionAddPropertyListener(routeChangeID, rvcHandleRouteChange, (__bridge void *)(self));
}
…….
void rvcHandleRouteChange(void *inUserData, AudioSessionPropertyID inPropertyID,
                          UInt32 inPropertyValueSize, const void *inPropertyValue)
{
    NSLog(@"Hi, rvcHandleRouteChange has been called.");
    if (inPropertyID != kAudioSessionProperty_AudioRouteChange) NSLog(@"WRONG CALL!!!");
    // Do some useful work ….
}
That seems to work rather well, except in one case where the rvcHandleRouteChange callback gets called for no apparent reason. Even with the test to filter out wrong calls, none of them show up as "WRONG CALL" — the callback fires without me plugging or unplugging any earphones.
As a consequence this gives me a lot of trouble.
Does anyone have an idea of why this could happen?

1: A route change callback can fire more than once for a single event. For example, plugging in your headphones can trigger two callbacks with the same route change reason code.
2: The route change listener gets called as soon as you set your audio session active. That means it fires at least once.
Maybe you are implementing your own audio interruption handling, where you activate/deactivate audio sessions?
Here is my route change listener, for whatever use it may be (I use the playAndRecord category) [updated to iOS 7]:
#pragma mark Route change listener
// *********************************************************************************************************
// *********** Route change listener ***********************************************************************
// *********************************************************************************************************
-(void)routeChanged:(NSNotification*)notification {
    NSLog(@"]-----------------[ Audio Route Change ]--------------------[");
    AVAudioSession *session = [AVAudioSession sharedInstance];
    //AVAudioSessionRouteDescription* prevRoute = [[notification userInfo] objectForKey:AVAudioSessionRouteChangePreviousRouteKey];
    // Reason
    NSInteger reason = [[[notification userInfo] objectForKey:AVAudioSessionRouteChangeReasonKey] integerValue];
    switch (reason) {
        case AVAudioSessionRouteChangeReasonNoSuitableRouteForCategory:
            NSLog(@"] Audio Route: The route changed because no suitable route is now available for the specified category.");
            break;
        case AVAudioSessionRouteChangeReasonWakeFromSleep:
            NSLog(@"] Audio Route: The route changed when the device woke up from sleep.");
            break;
        case AVAudioSessionRouteChangeReasonOverride:
            NSLog(@"] Audio Route: The output route was overridden by the app.");
            break;
        case AVAudioSessionRouteChangeReasonCategoryChange:
            NSLog(@"] Audio Route: The category of the session object changed.");
            break;
        case AVAudioSessionRouteChangeReasonOldDeviceUnavailable:
            NSLog(@"] Audio Route: The previous audio output path is no longer available.");
            break;
        case AVAudioSessionRouteChangeReasonNewDeviceAvailable:
            NSLog(@"] Audio Route: A preferred new audio output path is now available.");
            break;
        case AVAudioSessionRouteChangeReasonUnknown:
            NSLog(@"] Audio Route: The reason for the change is unknown.");
            break;
        default:
            NSLog(@"] Audio Route: The reason for the change is very unknown.");
            break;
    }
    // Output
    AVAudioSessionPortDescription *output = [session.currentRoute.outputs count] ? [session.currentRoute.outputs objectAtIndex:0] : nil;
    if ([output.portType isEqualToString:AVAudioSessionPortLineOut]) {
        NSLog(@"] Audio Route: Output Port: LineOut");
    }
    else if ([output.portType isEqualToString:AVAudioSessionPortHeadphones]) {
        NSLog(@"] Audio Route: Output Port: Headphones");
    }
    else if ([output.portType isEqualToString:AVAudioSessionPortBluetoothA2DP]) {
        NSLog(@"] Audio Route: Output Port: BluetoothA2DP");
    }
    else if ([output.portType isEqualToString:AVAudioSessionPortBuiltInReceiver]) {
        NSLog(@"] Audio Route: Output Port: BuiltInReceiver");
    }
    else if ([output.portType isEqualToString:AVAudioSessionPortBuiltInSpeaker]) {
        NSLog(@"] Audio Route: Output Port: BuiltInSpeaker");
    }
    else if ([output.portType isEqualToString:AVAudioSessionPortHDMI]) {
        NSLog(@"] Audio Route: Output Port: HDMI");
    }
    else if ([output.portType isEqualToString:AVAudioSessionPortAirPlay]) {
        NSLog(@"] Audio Route: Output Port: AirPlay");
    }
    else if ([output.portType isEqualToString:AVAudioSessionPortBluetoothLE]) {
        NSLog(@"] Audio Route: Output Port: BluetoothLE");
    }
    else {
        NSLog(@"] Audio Route: Output Port: Unknown: %@", output.portType);
    }
    // Input
    AVAudioSessionPortDescription *input = [session.currentRoute.inputs count] ? [session.currentRoute.inputs objectAtIndex:0] : nil;
    if ([input.portType isEqualToString:AVAudioSessionPortLineIn]) {
        NSLog(@"] Audio Route: Input Port: LineIn");
    }
    else if ([input.portType isEqualToString:AVAudioSessionPortBuiltInMic]) {
        NSLog(@"] Audio Route: Input Port: BuiltInMic");
    }
    else if ([input.portType isEqualToString:AVAudioSessionPortHeadsetMic]) {
        NSLog(@"] Audio Route: Input Port: HeadsetMic");
    }
    else if ([input.portType isEqualToString:AVAudioSessionPortBluetoothHFP]) {
        NSLog(@"] Audio Route: Input Port: BluetoothHFP");
    }
    else if ([input.portType isEqualToString:AVAudioSessionPortUSBAudio]) {
        NSLog(@"] Audio Route: Input Port: USBAudio");
    }
    else if ([input.portType isEqualToString:AVAudioSessionPortCarAudio]) {
        NSLog(@"] Audio Route: Input Port: CarAudio");
    }
    else {
        NSLog(@"] Audio Input Port: Unknown: %@", input.portType);
    }
    NSLog(@"]--------------------------[ ]-----------------------------[");
}
Remember to add observers, since the audio session's delegate is deprecated too. For the route change listener above that means:
[[NSNotificationCenter defaultCenter] addObserver: self
                                         selector: @selector(routeChanged:)
                                             name: AVAudioSessionRouteChangeNotification
                                           object: nil];
and likewise for interruptions:
[[NSNotificationCenter defaultCenter] addObserver: self
                                         selector: @selector(audioInterruption:)
                                             name: AVAudioSessionInterruptionNotification
                                           object: nil];


Low Latency (50ms) Video Streaming with NODE.JS and html5

OBJECTIVE:
I'm building an FPV robot that I want to control from a web browser over a local Wi-Fi connection.
I'm using a Raspberry Pi 3B+ with Raspbian Stretch. I built my own motor control and power regulator HAT.
After lots of research and testing, I decided to use Node.js as the HTTP server and socket.io to provide low latency bidirectional communication with my robot. This stack achieves about 7 ms of latency.
Picture of the robot
PROBLEM:
I need to stream low latency video from a USB camera attached to the RPi to the browser. My target is at least 640x480 resolution at 10 FPS with 50 ms of latency or better. I'm happy to sacrifice visual fidelity to get a quicker response from my robot.
If possible I would like to stream over UDP to improve the reliability of the stream.
If possible I would like to stream video that modern web browsers can natively decode. I'd like to use the H264 codec and the HTML5 video tag.
I can fall back to a JavaScript player if there is no other option.
WHAT I TRIED:
I did extensive research and tried many tools.
Among others, I tried VLC, mjpg-streamer, GStreamer and raspivid. A few times I got a stream the web browser could view, but at best I got a latency of 700 ms at 320x240. Very, very far from my target.
Currently I'm looking into WebRTC solutions.
QUESTION:
I'd like suggestions for Node.js packages or other solutions to provide a UDP H264 video stream that can be decoded by an HTML5 video tag with a target latency of 50 ms.
Thanks
UPDATE:
Thanks for your answers! I'll keep updating this question and I'll post the solution once it works.
PUSH INDIVIDUAL FRAMES
I tried a different approach: pushing individual 200 KB 640x480 JPEG frames through a websocket. I got a latency of about 190 ms. I can probably do a lot better by reusing objects, but I'm putting this attempt on hold for now.
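The object reuse mentioned above could look something like this sketch. It is only an illustration of the idea, not the code I ran: `packFrame`, `FRAME_CAPACITY` and the 4-byte length prefix are hypothetical, and it assumes the client splits the stream on that prefix.

```javascript
// Hypothetical sketch: reuse one preallocated Buffer when framing JPEG
// snapshots for a websocket push, instead of allocating per frame.
const FRAME_CAPACITY = 200 * 1024; // assumed maximum JPEG size
const scratch = Buffer.allocUnsafe(4 + FRAME_CAPACITY);

// Prefix each frame with its byte length so the client can split the stream.
function packFrame(jpeg) {
  if (jpeg.length > FRAME_CAPACITY) throw new Error("frame too large");
  scratch.writeUInt32BE(jpeg.length, 0);
  jpeg.copy(scratch, 4);
  // Return a view into the scratch buffer; it must be sent (or copied)
  // before the next frame is packed, since the memory is reused.
  return scratch.subarray(0, 4 + jpeg.length);
}
```

With the ws library this would then be sent as `ws.send(packFrame(jpegBuffer), { binary: true })`, avoiding one allocation and copy per frame.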
UPDATE 2:
While researching WebRTC I found a stack that looked easy enough.
Server side, it uses V4L2 as the driver, FFMPEG to transcode locally into an MPEG1 HTTP stream with TS encapsulation, and Node.js to flip the stream into a websocket.
Client side, a JavaScript player decodes the MPEG1 TS stream and paints it onto a canvas object in the HTML page.
It achieves 640x480@20FPS with 240 ms of latency.
Good enough for an MVP, but I'll keep working to get it down.
Code in the answer.
I adapted code from here and integrated it with an HTTP server and socket.io controls:
https://github.com/phoboslab/jsmpeg
Server:
V4L2 -> FFMPEG (MPEG1 TS) -> NODE HTTP Server -> NODE Websocket broadcast
Client:
Websocket -> Javascript (Decode MPEG1 TS and paint to html canvas) -> Html Canvas
This stack achieves 640x480@20FPS with 240 ms of latency. Still far from my target, but good enough as an MVP. The controls in both directions have a latency of 7 ms, which is excellent.
This stack is held back by the transcoding and decoding stages, and the RPi gets really hot. The transport of raw data through the websocket looks good; I'm going to profile the latency of each step in the future.
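The per-step profiling could be done by recording a timestamp as a chunk passes each stage and reporting the deltas. A minimal sketch (createProfiler is a hypothetical helper, not part of the code below):

```javascript
// Hypothetical per-stage latency profiler: call mark(stage) as data
// passes each stage, then deltas() reports the time between stages.
function createProfiler() {
  const marks = [];
  return {
    mark(stage) {
      marks.push({ stage, t: Date.now() });
    },
    deltas() {
      const out = {};
      for (let i = 1; i < marks.length; i++) {
        out[marks[i - 1].stage + "->" + marks[i].stage] = marks[i].t - marks[i - 1].t;
      }
      return out;
    },
  };
}
```

For example, marking "capture" when an ffmpeg chunk arrives on the HTTP request and "broadcast" after the websocket send would isolate the server-side share of the 240 ms.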
Execution:
pi@MazeRunner:~ $ node node.js &
pi@MazeRunner:~ $ ffmpeg -f v4l2 -framerate 20 -video_size 640x480 -i /dev/video0 -f mpegts -codec:v mpeg1video -s 640x480 -b:v 600k -bf 0 http://localhost:8080/mystream
Server side NODE.JS
//operating system library. Used to get local IP address
var os = require("os");
//file system library. Used to load file stored inside back end server (https://nodejs.org/api/fs.html)
var fs = require("fs");
//http system library. Handles basic html requests
var http = require("http").createServer(http_handler);
//url library. Used to process html url requests
var url = require("url");
//Websocket
var io = require("socket.io")(http);
//Websocket used to stream video
var websocket = require("ws");
//-----------------------------------------------------------------------------------
// CONFIGURATION
//-----------------------------------------------------------------------------------
//Port the server will listen to
var server_port = 8080;
var websocket_stream_port = 8082;
//Path of the http and css files for the http server
var file_index_name = "index.html";
var file_css_name = "style.css";
var file_jsplayer_name = "jsmpeg.min.js";
//Http and css files loaded into memory for fast access
var file_index;
var file_css;
var file_jsplayer;
//Name of the local video stream
var stream_name = "mystream";
//-----------------------------------------------------------------------------------
// DETECT SERVER OWN IP
//-----------------------------------------------------------------------------------
//If just one interface, store the server IP Here
var server_ip;
//Get local IP address of the server
//https://stackoverflow.com/questions/3653065/get-local-ip-address-in-node-js
var ifaces = os.networkInterfaces();
Object.keys(ifaces).forEach
(
function (ifname)
{
var alias = 0;
ifaces[ifname].forEach
(
function (iface)
{
if ('IPv4' !== iface.family || iface.internal !== false)
{
// skip over internal (i.e. 127.0.0.1) and non-ipv4 addresses
return;
}
if (alias >= 1)
{
// this single interface has multiple ipv4 addresses
console.log('INFO: Server interface ' +alias +' - ' + ifname + ':' + alias, iface.address);
}
else
{
server_ip = iface.address;
// this interface has only one ipv4 address
console.log('INFO: Server interface - ' +ifname, iface.address);
}
++alias;
}
);
}
);
//-----------------------------------------------------------------------------------
// HTTP SERVER
//-----------------------------------------------------------------------------------
// Fetch and serves local files to client
//Create http server and listen to the given port
http.listen
(
server_port,
function( )
{
console.log('INFO: ' +server_ip +' listening to html requests on port ' +server_port);
//Pre-load http, css and js files into memory to improve http request latency
file_index = load_file( file_index_name );
file_css = load_file( file_css_name );
file_jsplayer = load_file( file_jsplayer_name );
}
);
//-----------------------------------------------------------------------------------
// HTTP REQUESTS HANDLER
//-----------------------------------------------------------------------------------
// Answer to client http requests. Serve http, css and js files
function http_handler(req, res)
{
//If client asks for root
if (req.url == '/')
{
//Request main page
res.writeHead( 200, {"Content-Type": detect_content(file_index_name),"Content-Length":file_index.length} );
res.write(file_index);
res.end();
console.log("INFO: Serving file: " +req.url);
}
//If client asks for css file
else if (req.url == ("/" +file_css_name))
{
//Request main page
res.writeHead( 200, {"Content-Type": detect_content(file_css_name),"Content-Length" :file_css.length} );
res.write(file_css);
res.end();
console.log("INFO: Serving file: " +req.url);
}
//If client asks for the js player file
else if (req.url == ("/" +file_jsplayer_name))
{
//Request main page
res.writeHead( 200, {"Content-Type": detect_content(file_jsplayer_name),"Content-Length" :file_jsplayer.length} );
res.write(file_jsplayer);
res.end();
console.log("INFO: Serving file: " +req.url);
}
//Listening to the port the stream from ffmpeg will flow into
else if (req.url == "/mystream")
{
res.connection.setTimeout(0);
console.log( "Stream Connected: " +req.socket.remoteAddress + ":" +req.socket.remotePort );
req.on
(
"data",
function(data)
{
streaming_websocket.broadcast(data);
/*
if (req.socket.recording)
{
req.socket.recording.write(data);
}
*/
//console.log("broadcast: ", data.length);
}
);
req.on
(
"end",
function()
{
console.log("local stream has ended");
if (req.socket.recording)
{
req.socket.recording.close();
}
}
);
}
//If client asks for an unhandled path
else
{
res.end();
console.log("ERR: Invalid file request" +req.url);
}
}
//-----------------------------------------------------------------------------------
// WEBSOCKET SERVER: CONTROL/FEEDBACK REQUESTS
//-----------------------------------------------------------------------------------
// Handle websocket connection to the client
io.on
(
"connection",
function (socket)
{
console.log("connecting...");
socket.emit("welcome", { payload: "Server says hello" });
//Periodically send the current server time to the client in string form
setInterval
(
function()
{
socket.emit("server_time", { server_time: get_server_time() });
},
//Send every 333ms
333
);
socket.on
(
"myclick",
function (data)
{
timestamp_ms = get_timestamp_ms();
socket.emit("profile_ping", { timestamp: timestamp_ms });
console.log("button event: " +" client says: " +data.payload);
}
);
//"ArrowLeft"
socket.on
(
"keyboard",
function (data)
{
timestamp_ms = get_timestamp_ms();
socket.emit("profile_ping", { timestamp: timestamp_ms });
console.log("keyboard event: " +" client says: " +data.payload);
}
);
//profile packets from the client are answer that allows to compute roundway trip time
socket.on
(
"profile_pong",
function (data)
{
timestamp_ms_pong = get_timestamp_ms();
timestamp_ms_ping = data.timestamp;
console.log("Pong received. Round trip time[ms]: " +(timestamp_ms_pong -timestamp_ms_ping));
}
);
}
);
//-----------------------------------------------------------------------------------
// WEBSOCKET SERVER: STREAMING VIDEO
//-----------------------------------------------------------------------------------
// Websocket Server
var streaming_websocket = new websocket.Server({port: websocket_stream_port, perMessageDeflate: false});
streaming_websocket.connectionCount = 0;
streaming_websocket.on
(
"connection",
function(socket, upgradeReq)
{
streaming_websocket.connectionCount++;
console.log
(
'New websocket Connection: ',
(upgradeReq || socket.upgradeReq).socket.remoteAddress,
(upgradeReq || socket.upgradeReq).headers['user-agent'],
'('+streaming_websocket.connectionCount+" total)"
);
socket.on
(
'close',
function(code, message)
{
streaming_websocket.connectionCount--;
console.log('Disconnected websocket ('+streaming_websocket.connectionCount+' total)');
}
);
}
);
streaming_websocket.broadcast = function(data)
{
streaming_websocket.clients.forEach
(
function each(client)
{
if (client.readyState === websocket.OPEN)
{
client.send(data);
}
}
);
};
//-----------------------------------------------------------------------------------
// FUNCTIONS
//-----------------------------------------------------------------------------------
//-----------------------------------------------------------------------------------
// SERVER DATE&TIME
//-----------------------------------------------------------------------------------
// Get server time in string form
function get_server_time()
{
my_date = new Date();
return my_date.toUTCString();
}
//-----------------------------------------------------------------------------------
// TIMESTAMP
//-----------------------------------------------------------------------------------
// Profile performance in ms
function get_timestamp_ms()
{
    //Full epoch timestamp in milliseconds; combining getSeconds() and
    //getMilliseconds() would wrap around every minute and corrupt
    //round trip measurements
    return Date.now();
}
//-----------------------------------------------------------------------------------
// FILE LOADER
//-----------------------------------------------------------------------------------
// Load files into memory for improved latency
function load_file( file_name )
{
var file_tmp;
var file_path = __dirname +"/" +file_name;
//HTML index file
try
{
file_tmp = fs.readFileSync( file_path );
}
catch (err)
{
console.log("ERR: " +err.code +" failed to load: " +file_path);
throw err;
}
console.log("INFO: " +file_path +" has been loaded into memory");
return file_tmp;
}
//-----------------------------------------------------------------------------------
// CONTENT TYPE DETECTOR
//-----------------------------------------------------------------------------------
// Return the right content type to give correct information to the client browser
function detect_content( file_name )
{
if (file_name.includes(".html"))
{
return "text/html";
}
else if (file_name.includes(".css"))
{
return "text/css";
}
else if (file_name.includes(".js"))
{
return "application/javascript";
}
else
{
throw "invalid extension";
}
}
Client Side html
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8"/>
<title>Maze Runner</title>
<link rel="stylesheet" href="style.css">
<script type="text/javascript" src="/socket.io/socket.io.js"></script>
<script type="text/javascript">
var host_ip = document.location.hostname;
console.log("connecting to host: ", host_ip);
//Get references to the html controls
textbox_input1 = window.document.getElementById("my_text_box")
//Connect to the server via websocket
var mysocket = io("http://" +host_ip +":8080");
//Long lived frame object
var last_frame;
//-----------------------------------------
// CONNECTION ACKNOWLEDGE
//-----------------------------------------
// Link is initiated by the client
// Server sends a welcome message when the link is established
// Server could send an auth token to keep track of individual clients and login data
mysocket.on
(
"welcome",
(message) =>
{
console.log("Server websocket connection acknowledged... " +message.payload);
}
)
//-----------------------------------------
// SERVER->CLIENT CONTROLS
//-----------------------------------------
// Server can send an async message to dynamically update the page without reloading
// This is an example message with the server local date and time in string form
mysocket.on
(
"server_time",
(message) =>
{
fill_label( message.server_time );
console.log("Server sent its local time... " +message.server_time);
}
)
function fill_label( payload )
{
textbox_input1.value=payload;
}
//-----------------------------------------
// CLIENT->SERVER CONTROLS
//-----------------------------------------
// Controls inside the webpage can emit async events to the server
// In this example I have a push button and I catch keyboard strokes
//Handler for a pushbutton
function socket_button_handler()
{
mysocket.emit("myclick", { payload: "button was clicked" });
console.log("Button was clicked...");
}
//Listen for keystrokes
window.document.addEventListener
(
"keypress",
function onEvent(event)
{
//Inform the server that a key has been pressed
mysocket.emit("keyboard", { payload: event.key });
console.log("Key press...");
}
);
//-----------------------------------------
// PING-PONG
//-----------------------------------------
// Server sends ping messages with a timestamp
// Client answers with pongs to allow server to profile latency of the channel
//profile messages means the server wants to compute roundway trip
mysocket.on
(
"profile_ping",
(message) =>
{
//Answer back with the received timestamp so that server can compute roundway trip
mysocket.emit("profile_pong", { timestamp: message.timestamp });
console.log( "server wants a pong. server absolute timestamp[ms]: " +message.timestamp );
}
);
</script>
</head>
<body>
<h1>Html+Css Server +low latency Websocket server</h1>
<!-- button control with socket emitter as handler -->
<p> This button will emit a websocket event. The server will be informed in real time of the event. </p>
<button id="my_button" type="button" onclick="socket_button_handler()">Websocket Button!</button>
<!-- input text control -->
<p> This input can be filled through websockets directly by the server in real time </p>
<input id="my_text_box" type="text" value="" size="40">
<!-- canvas object, it's painted by the javascript video decoder -->
<p> This canvas is painted by the javascript player and shows the live stream.</p>
<canvas id="video-canvas" width=640 height=480></canvas>
<!-- Javascript video decoder, take in a data stream from a websocket and paint on a canvas -->
<script type="text/javascript" src="jsmpeg.min.js"></script>
<script type="text/javascript">
var mycanvas = document.getElementById("video-canvas");
var url = "ws://" + host_ip +":8082/";
var player = new JSMpeg.Player(url, {canvas: mycanvas});
</script>
</body>
</html>
Javascript Player
You can get the javascript player I used from here:
https://github.com/phoboslab/jsmpeg/blob/master/jsmpeg.min.js
I’d like suggestions for NODE.JS packages or other solutions to provide a UDP H264 video stream that can be decoded by an HTML5 video tag with a target latency of 50ms.
That's almost certainly not possible in that configuration.
If you drop the video tag requirement and use straight WebRTC in the browser, you may be able to get down to about 150 ms.

Why my property won't reflect iron-localstorage value?

I have a login system where I need to keep the logged-in user in localStorage, and once the user logs in I want to redirect them to their own page whenever they access the login page.
My template:
<iron-localstorage name="user-storage" value="{{storedUser}}"></iron-localstorage>
My object:
static get properties() {
    return {
        storedUser: {
            type: Object,
            notify: true
        },
        ...
    }
}
I want to do this:
redirect() {
    console.log(this.storedUser);
    if (this.storedUser) {
        // Redirect user or admin to their own homescreen
        if (this.storedUser.role == 'user')
            this.set('route.path', '/homescreen-usuario');
        else if (this.storedUser.role == 'admin')
            this.set('route.path', '/homescreen-admin');
        else
            this.set('route.path', '/my-view404');
    }
}
But the first line always logs "undefined".
You should use the on-iron-localstorage-load-empty and on-iron-localstorage-load events to react to the storage being loaded, or you can put an observer on your storedUser property.
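For illustration, the redirect logic can be factored into a plain helper that the observer (or the load event handler) calls once the value is actually available. This is only a sketch: routeFor is a hypothetical name, and it merely mirrors the role checks shown in the question.

```javascript
// Hypothetical helper mirroring the question's redirect logic.
// Call it from an observer on storedUser (or from the
// iron-localstorage load event), i.e. only after the element has
// populated the property - not from ready()/connectedCallback,
// where storedUser is still undefined.
function routeFor(user) {
  if (!user) return null; // storage not loaded yet, or empty: don't redirect
  if (user.role === 'user') return '/homescreen-usuario';
  if (user.role === 'admin') return '/homescreen-admin';
  return '/my-view404';
}
```

In the observer you would then do something like `const path = routeFor(this.storedUser); if (path) this.set('route.path', path);`.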

Server-Sent Event is not working- Browsers can not listen any events of Server-Sent Event

I am implementing Server-Sent Events (HTML5) in my project without using Node.js. It is just a simple webpage (another JSP page) call, and I do get a response from it, but none of the methods/functions (onopen, onmessage or onerror) execute...
Client Code:
if (!!window.EventSource) {
    var source = new EventSource("kitAvailCall.jsp");
    source.onopen = function() {
        alert("Kit is not available");
        source.close();
    };
    source.onmessage = function(event) {
        console.log(event.data);
        alert("Kit is not available");
    }
    source.onerror = function() {
        console.log("EventSource: Getting error call");
        alert("Kit is not available");
    }
}
Server-side code:
try {
    while(true) {
        Thread.sleep(15000);
        String IpAddress = (String) session.getAttribute("IPName");
        boolean bool;
        if(IpAddress != null && ((new Date()).getTime() - session.getLastAccessedTime())/1000 > 28){
            bool = sample.pingToKit((String) session.getAttribute("IPName"));
            System.out.println("Long polling request: "+bool);
            //if bool is false then i want to quit loop and back to browser
            if(bool == false){
                response.setHeader("Content-Type","text/event-stream");
                response.setHeader("Cache-Control", "no-cache");
                out.print("data: " + bool);
                out.flush();
                break;
            }
        }
    }
} catch(Exception e){
    System.out.println("Going bad CONN:"+ e);
}
I don't know much Java, but I'd conjecture that you might be listening from the same controller/JSP servlet (whatever routes data in Java) that you made to send the stream.
Listen from a page other than kitAvailCall.jsp. That script is meant only for streaming; use another view/HTML page, not kitAvailCall.jsp itself.
The thing you have to understand is that SSE is a concurrent process: you create another page and run the JavaScript (client code) included in that page. And your server should support multi-threading as well.
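To make the expected wire format concrete, here is a small sketch (in JavaScript only because the rest of this page uses it; the server can be written in any language, including JSP). sseMessage is a hypothetical helper that builds one event in the text/event-stream format the browser's EventSource expects:

```javascript
// Hypothetical helper: build one Server-Sent Event.
// The endpoint must first send the headers
//   Content-Type: text/event-stream
//   Cache-Control: no-cache
// and then write chunks like these, flushing after each one.
function sseMessage(data, eventName) {
  let out = "";
  if (eventName) out += "event: " + eventName + "\n"; // optional named event
  out += "data: " + data + "\n\n"; // the blank line terminates the event
  return out;
}
```

Note that each event must end with a blank line: the `out.print("data: " + bool)` in the question's server code writes no terminating newlines, which by itself would prevent onmessage from ever firing.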

Set FOSRestBundle Exception _format

I am using FOSRestBundle in my Symfony 2.3 project.
I am not able to set _format for exception responses.
In my config.yml I have:
twig:
    exception_controller: 'FOS\RestBundle\Controller\ExceptionController::showAction'
The default return is HTML format, but is it possible to set _format = json for returned exceptions?
I have more than one bundle, but only one is a REST bundle, so the other bundles need to keep behaving the normal way.
You can write your routes manually and set _format there, like this:
acme_demo.api.user:
    type: rest
    pattern: /user/{username_canonical}.{_format}
    defaults: { _controller: 'AcmeDemoBundle:User:getUser', username_canonical: null, _format: json }
    requirements:
        _method: GET
Edit: Or you can write your own exception handler and do with exceptions whatever you need to do:
// src/Acme/DemoBundle/EventListener/AcmeExceptionListener.php
namespace Acme\DemoBundle\EventListener;

use Symfony\Component\HttpKernel\Event\GetResponseForExceptionEvent;
use Symfony\Component\HttpFoundation\JsonResponse;
use Symfony\Component\HttpKernel\Exception\HttpExceptionInterface;

class AcmeExceptionListener
{
    public function onKernelException(GetResponseForExceptionEvent $event)
    {
        // do whatever tests you need - in this example I filter by path prefix
        $path = $event->getRequest()->getRequestUri();
        if (strpos($path, '/api/') === 0) {
            return;
        }
        $exception = $event->getException();
        $response = new JsonResponse($exception, 500);
        // HttpExceptionInterface is a special type of exception that
        // holds status code and header details
        if ($exception instanceof HttpExceptionInterface) {
            $response->setStatusCode($exception->getStatusCode());
            $response->headers->replace($exception->getHeaders());
        }
        // Send the modified response object to the event
        $event->setResponse($response);
    }
}
And register it as a listener:
# app/config/config.yml
services:
    kernel.listener.your_listener_name:
        class: Acme\DemoBundle\EventListener\AcmeExceptionListener
        tags:
            - { name: kernel.event_listener, event: kernel.exception, method: onKernelException }
How to create an Event Listener
The simplest method to catch Symfony exceptions and return JSON when a request is made to a FOSRest controller is the following:
# app/config/config.yml
fos_rest:
    format_listener:
        rules:
            - { path: '^/api/', priorities: ['json', 'xml'] }
            - { path: '^/', stop: true }

AS3 - Reliability of NetConnection calls using RTMFP / UDP?

My applet connects to FMS 4.5 using RTMFP. I open one NetConnection and stream video from the server to the applet using a NetStream. When I want to execute a function on either the applet or the server I use NetConnection.call().
My question is: does AS3 do anything internally to make sure the call over UDP happens, or is there a risk of UDP loss and the function never executing when it is called?
Assuming there might be times the UDP traffic is lost, I made a responder that retries onStatus. Does this code look like it would actually do the job, or is it overkill?
// This is code on FMS to call a function in the applet.
var networkRetryMax = 30;

if (something == true) doSomethingInApplet(client, 0);

function doSomethingInApplet(client, retryCount) {
    if (retryCount < networkRetryMax) {
        retryCount++;
        client.call("funcNameInApplet", new doSomethingInAppletResponder(client, retryCount), "foobar");
    }
    return;
}

function doSomethingInAppletResponder(client, retryCount) {
    this.onResult = function() {...}
    this.onStatus = function() {doSomethingInApplet(client, retryCount);}
    return;
}
And...
// This is code in the applet.
// I return true to give it something to onResult or onStatus.
public static function funcNameInApplet(result:String):Boolean {
    trace("Result: " + result);
    return true;
}
So would this ensure the function is called over UDP / RTMFP?
Or is .call() reliability handled internally, making this unnecessary?