Videojs-swf - RTMP connect without publishing stream or disconnect - actionscript-3

How can I put the player into a waiting state when the stream is not published?
On unpublish (NetStream.Play.UnpublishNotify) it should switch to waiting, and on publish (NetStream.Play.PublishNotify) it should resume playback.

I don't use Video-JS, so I don't know what their code setup is doing.
Normally (without Video-JS) it's done something like this:
yourNS.addEventListener(NetStatusEvent.NET_STATUS, streamEventsHandler);

private function streamEventsHandler(evt:NetStatusEvent):void
{
    //trace(evt.info.code); // if you need to check the event status as text
    if (evt.info.code == "NetStream.Play.UnpublishNotify") { yourNS.pause(); }
    if (evt.info.code == "NetStream.Play.PublishNotify") { yourNS.resume(); }
}
So now in the Video-JS code, find the part that handles NetStream events and edit that function to call pause() or resume().
If you're unsure of the real NetStream name (I just called it yourNS as an example), then instead of yourNS.pause() you can try evt.target.pause().
(PS: I use evt because that's the name in the function parameter (evt:NetStatusEvent):void... check your own Video-JS function.)
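For illustration, here is a minimal sketch of the same handler written against evt.target instead of a named stream variable (the cast to NetStream is an assumption about how the event is dispatched in your setup):

private function streamEventsHandler(evt:NetStatusEvent):void
{
    // evt.target should be the NetStream that dispatched the status event
    var ns:NetStream = evt.target as NetStream;
    if (ns == null) return;

    if (evt.info.code == "NetStream.Play.UnpublishNotify") { ns.pause(); }
    if (evt.info.code == "NetStream.Play.PublishNotify") { ns.resume(); }
}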

Related

LibGDX: How can I tell when a sound has finished playing?

The Sound API seems to be missing a function to indicate that a sound is finished playing. Is there some other way of finding out if the sound is done?
Not without submitting a patch to libgdx. As far as I know, the underlying sound backends for both OpenAL and Android don't even track that information internally, though the Music API has an isPlaying() function and a getPosition() function, as per the documentation.
Just set this:
sound.setLooping(false);
This way it will not run again and again. To check whether the sound is playing or not, make a boolean variable:
boolean soundPlaying;
In the render method do this:
if (sound.isPlaying()) {
    soundPlaying = true;
}
and make a log:
Gdx.app.log("", "sound " + soundPlaying);
You can track this by storing the sound instance id that e.g. play() and loop() return. Example code:
private Sound sound;
private Long soundId;
...
public void startSound() {
    if (soundId != null) {
        return; // already playing
    }
    soundId = sound.loop(); // or sound.play()
}

public void stopSound() {
    sound.stop(soundId);
    soundId = null;
}
If you didn't want to have to call stopSound() from client code, you could just call it from startSound() instead of the return, to ensure any previous sound is stopped.

Adobe Cirrus Error on Direct Connect: "Property startTransmit not found on flash.net.NetStream"

The error:
ReferenceError: Error #1069: Property startTransmit not found on flash.net.NetStream and there is no default value.
I've played around with Cirrus plenty of times and have never seen this error before, but now I can't get it to go away.
My p2p direct connect works just fine, but every single time I see this error pop up. It throws an exception, and I can't figure out where exactly it's happening.
Has anyone encountered this before? Any ideas where I should look?
Every client object needs to have the following functions defined.
client.stopTransmit = function($p1:*, $p2:*):void {
    trace("stopTransmit called", $p1, $p2);
};
client.startTransmit = function():void {
    trace("startTransmit called");
};
For example, set these in the onPeerConnect function:
sendStream.client = new Object();
sendStream.client.onPeerConnect = function(subscriber:NetStream):Boolean {
    var client:Object = new Object();
    client.stopTransmit = function($p1:*, $p2:*):void {
        trace("stopTransmit called", $p1, $p2);
    };
    client.startTransmit = function():void {
        trace("startTransmit called");
    };
    subscriber.client = client;
    return true; // accept the peer connection
};
Additionally these need to be set on your sendStreamClient's client property:
sendStreamClient.client.stopTransmit = function($p1:*, $p2:*):void {
    trace("stopTransmit called", $p1, $p2);
};
sendStreamClient.client.startTransmit = function():void {
    trace("startTransmit called");
};
And they need to be set on your receiveStreamClient's client property.
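For completeness, a sketch of the same assignment mirrored on the receive side (assuming, as the text above implies, a receiving stream object named receiveStreamClient with a client property):

receiveStreamClient.client.stopTransmit = function($p1:*, $p2:*):void {
    trace("stopTransmit called", $p1, $p2);
};
receiveStreamClient.client.startTransmit = function():void {
    trace("startTransmit called");
};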
On the server-side script, you (or somebody else) have probably set up the application so that it calls back a function - in this case startTransmit - and it isn't handled on the client side.
Remove the code from the server, add a default value, or add a function to your code.
In my similar program, I had to add the function to my code (in my case it was not 'startTransmit'):
if ("NetConnection.Connect.Success" == e.info.code) {
netConnection.client=new Object();
netConnection.client.startTransmit=startTransmit; //no columns!
}
where startTransmit is
private function startTransmit():Boolean{
return true;
}
Are you sending H.264 video? I think it has to do with that...
If you add
public function startTransmit($p1:*, $p2:*):void {
}
public function stopTransmit():void {
}
where you have your media server connection, it should work fine; at least it does for me :)
There is another NetStream besides receiveStream and sendStream. You should set the startTransmit and stopTransmit functions on the callerns NetStream, something like this:
sendStreamClient.onPeerConnect = function(callerns:NetStream):Boolean {
    var farStreamClient:Object = new Object();
    farStreamClient.stopTransmit = function($p1:*, $p2:*):void {
        trace("-------------farStream stopTransmit called!", $p1, $p2);
    };
    farStreamClient.startTransmit = function():void {
        trace("-------------farStream startTransmit called!");
    };
    callerns.client = farStreamClient;
    return true; // accept the caller's connection
};
The problem is not on the AMS or Red5 server; even transmitting video over P2P from an Android device triggers the same error.
The solution above worked.
Actually, stopTransmit() sends a Boolean and an integer. It would be great to know what they mean.
I have opened a bug on the Adobe bugbase in order to have this documented or removed. Please vote:
https://bugbase.adobe.com/index.cfm?event=bug&id=3844856

Using AS3WavSound to play WAV - cannot stop instantly

I'm using the AS3WavSound class (http://code.google.com/p/as3wavsound/) to play back externally loaded wavs. This works successfully; the library is simple and effective.
After decoding the wav ByteArray, the library handles playback by listening for the SampleDataEvent.SAMPLE_DATA event and writing the mixed samples to the output stream.
player.addEventListener(SampleDataEvent.SAMPLE_DATA, onSamplesCallback);

private function onSamplesCallback(evt:SampleDataEvent):void
{
    for (var i:int = 0; i < samplesLength; i++)
    {
        if (_mute == false) {
            outputStream.writeFloat(samplesLeft[i]);
            outputStream.writeFloat(samplesRight[i]);
        }
    }
}
My problem is that I need to silence this audio output immediately, but with every method I have tried there is a distinct delay (approximately 1 second) before the silence takes effect.
As you can see, I've attempted to add a boolean to block any samples being written to the output stream, but this has had no effect on the delay.
My suspicion is that this is a fundamental part of how the samples are buffered and then written out: by the time a user action on screen (clicking a mute button) has been handled and the _mute boolean set to true, there are already samples queued for output that cannot be affected.
Any advice or confirmation of my suspicion would be greatly appreciated.
Thanks,
gfte.
Your suspicion is probably right - but why stop it at that level? If you want to turn off the sound, would it not be better to set the volume on the soundTransform property of the SoundChannel object returned by the play method? (I assume the wav library returns this in some way.)
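For reference, this is how that looks with the native Flash Sound API; whether the wav library's channel object exposes an equivalent soundTransform is an assumption, so treat this as a sketch of the idea rather than the library's API:

var channel:SoundChannel = nativeSound.play(); // nativeSound: a flash.media.Sound instance

// later, to mute instantly:
var st:SoundTransform = channel.soundTransform;
st.volume = 0;
channel.soundTransform = st; // reassign so the change takes effect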
It looks like the library you are using has a similar design to the native Flash Sound API, wherein a SoundChannel-like object is returned from the play() method. This channel instance has a stop() method which should stop the sound right away.
var sound:WavSoundPlayer = new WavSoundPlayer();
var channel:WavSoundChannel;

sound.addEventListener(SampleDataEvent.SAMPLE_DATA, onSampleData);
channel = sound.play();

private function onSampleData(evt:SampleDataEvent):void
{
    for (var i:int = 0; i < samplesLength; i++)
    {
        outputStream.writeFloat(samplesLeft[i]);
        outputStream.writeFloat(samplesRight[i]);
    }
}

// later, when you want to stop immediately:
channel.stop();
The _mute variable in your example will only be able to change either before or after the loop, not while it is looping.
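If you do want the mute to take effect inside the callback itself rather than stopping the channel, one option - a sketch, not specific to this library - is to keep writing samples but write zeros while muted. (With the native SampleDataEvent API, writing nothing at all eventually ends the sound, since providing fewer than 2048 sample frames signals completion; whether the wav library behaves the same way is an assumption.)

private function onSamplesCallback(evt:SampleDataEvent):void
{
    for (var i:int = 0; i < samplesLength; i++)
    {
        // write silence instead of skipping the write, so the stream keeps running
        outputStream.writeFloat(_mute ? 0 : samplesLeft[i]);
        outputStream.writeFloat(_mute ? 0 : samplesRight[i]);
    }
}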

Selecting Input TextField throws Security sandbox violation in loaded swf in Adobe AIR

I have a swf, loaded into the non-application sandbox in Adobe AIR 1.5 (the shell is already installed with our users so I can't update to version 2+).
On the stage in the swf are buttons, movieclips, animations etc - all of these work fine.
When we add an input TextField, selecting this TextField causes a Security Sandbox Violation.
Error message (in debug mode) is (I've edited the actual file names):
[trace] *** Security Sandbox Violation ***
[trace] SecurityDomain 'file:///path/to/local/loaded.swf' tried to access incompatible context 'app:/loadingApp-debug.swf'
The user then is unable to enter text into the TextField. The rest of the application is unaffected.
The FDB stacktrace only shows:
this = [Object 57216577, class='flash.utils::Timer'].Timer/tick() at <null>:0
Has anyone got a workaround for this?
I'm guessing it's either the TextField attempting to access the stage, or an event attempting to bubble / access global properties.
I understand the AIR sandbox restrictions and use them daily - with sandbox bridges from parent to child and child to parent, etc. - so perhaps there is something I need to expose to allow this to work?
Any clues?
Edit:
I've now tracked the problem down to the TextField attempting to do
this.stage.focus = this;
or something equivalent when MouseDown happens.
It also appears that there is no access to KeyboardEvents in loaded swfs, so my thought of making the 'field' a button and then controlling input by listening to KeyboardEvents is dead in the water.
Now looking at whether to relay events to callbacks passed through the parent sandbox bridge, or whether minimal comps might save my butt.
Ok, I have an insane workaround, but it's pretty solid. I'm going to post it almost in full here, though I'll probably make it generic and upload it to github at some point.
In my shell, I have a view-with-mediator (I'm using robotlegs) which I'm calling EventRelayer and EventRelayerMediator.
The view's only purpose is to give the mediator access to the stage.
I exposed some functions on the parentSandboxBridge:
public function requestKeyboardEventRelay(eventType:String, callback:Function):void;
public function requestMouseEventRelay(eventType:String, callback:Function):void;
public function cancelKeyboardEventRelay(eventType:String, callback:Function):void;
public function cancelMouseEventRelay(eventType:String, callback:Function):void;
My sandbox bridges always just translate into strongly typed events, so these fire events like:
RelayEvent(RelayEvent.START_RELAY_REQUESTED, KeyboardEvent, eventType, callback);
RelayEvent(RelayEvent.CANCEL_RELAY_REQUESTED, MouseEvent, eventType, callback);
These are picked up by the EventRelayerMediator, and translated into handlers in an eventMap:
override public function onRegister():void
{
    createRelayHandlerFactories();
    eventMap.mapListener(eventDispatcher, RelayEvent.START_RELAY_REQUESTED, startRelay);
}

protected function startRelay(e:RelayEvent):void
{
    var handler:Function = createRelayHandler(e.relayEventClass, e.callback);
    eventMap.mapListener(view.stage, e.relayEventType, handler, e.relayEventClass);
}

protected function createRelayHandler(relayEventClass:Class, callback:Function):Function
{
    var handler:Function = relayHandlerFactoriesByEventClass[relayEventClass](callback);
    return handler;
}

protected function createRelayHandlerFactories():void
{
    relayHandlerFactoriesByEventClass = new Dictionary();
    relayHandlerFactoriesByEventClass[KeyboardEvent] = createKeyboardEventRelayHandler;
    relayHandlerFactoriesByEventClass[MouseEvent] = createMouseEventRelayHandler;
}

protected function createKeyboardEventRelayHandler(callback:Function):Function
{
    var handler:Function = function(e:KeyboardEvent):void
    {
        trace("Relaying from shell: " + e.toString());
        // passing an object because the sandbox bridge doesn't allow strongly typed values, only primitives
        var o:Object = {};
        o.type = e.type;
        o.charCode = e.charCode;
        o.keyCode = e.keyCode;
        o.altKey = e.altKey;
        o.ctrlKey = e.ctrlKey;
        o.shiftKey = e.shiftKey;
        // no point adding other props as we can't pass them
        // to the constructor of the KeyboardEvent
        callback(o);
    };
    return handler;
}
The loaded swf passes a callback which just re-assembles and re-dispatches the events.
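As an illustration (not the author's actual code), the callback on the loaded-swf side might look roughly like this, rebuilding a KeyboardEvent from the primitive object and re-dispatching it on the swf's root:

// hypothetical callback handed across the sandbox bridge
private function onRelayedKeyboardEvent(o:Object):void
{
    var e:KeyboardEvent = new KeyboardEvent(
        o.type, true, false, o.charCode, o.keyCode,
        0, o.ctrlKey, o.altKey, o.shiftKey);
    root.dispatchEvent(e);
}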
My input TextField is now just a dynamic field with a click handler that activates listening for keyboard events on the root of the swf, and then updates the dynamic field accordingly.
At the moment that is super-crude, but I'll break it out into a robust, tested class now that I know it works.
I've used a dictionary to manage the handlers because I'm sure that memory leakage hell is about to follow and I'm expecting to have to relay the FocusEvents to stop entering text.
I need to test memory leakage and return a binding from the parentSandboxBridge function so that I can make sure I don't add the same handler twice, etc., but Adobe - you suck for not calling this out and providing a built-in relay mechanism.

Can I specify a delay before the browser raises the "rollover" event?

I am working on an ASP.NET web application that is required to bring up a popup on a rollover. I am using the "OnMouseOver" event and it works as expected. The problem is that the event is on a "hair trigger": even a casual passage of the mouse over the control brings up the popup (which then must be manually dismissed). I want to add a delay so that a rapid pass over the control in question does not trigger the event. Is there a way to set such a delay, or is there a different event that I could use to get the same "trigger event on a slow rollover"?
One solution comes to mind, though there may be better ways:
Make the onmouseover call the function via a setTimeout delay
Inside the function, check the mouse is actually over that element.
You could also use an onmouseout to clear the setTimeout, but then you'd have to store a reference to the timer in a global variable to get at it again.
What I ended up doing is as follows (oRow is a table row but it could be any control):
function ItemMouseOver(oRow /*, parameters for the popup */)
{
    oRow.showTimer = window.setTimeout(function()
    {
        alert('popup');
    }, 1000);
}

function ItemMouseOut(oRow)
{
    if (oRow.showTimer)
        window.clearTimeout(oRow.showTimer);
}
In the ASP.NET GridView's RowDataBound event, I added the following code:
protected void ReportGridView_RowDataBound(object sender, GridViewRowEventArgs e)
{
    if (e.Row.RowType == DataControlRowType.DataRow && (
        e.Row.RowState == DataControlRowState.Normal
        || e.Row.RowState == DataControlRowState.Alternate))
    {
        // get the input values for the popup for the row (stuff deleted)
        e.Row.Attributes["onmouseover"] = "javascript:ItemMouseOver(this /*, parameters for the popup */);";
        e.Row.Attributes["onmouseout"] = "javascript:ItemMouseOut(this);";
    }
}
It works just fine. Thanks.