I'm having an issue with Flash, which I'm not very familiar with. I based this code on the video chat example that ships with Wowza Media Server, but unlike that example, Flash never prompts me to allow access to the camera.
Below is my ActionScript:
import flash.events.MouseEvent;
import flash.events.NetStatusEvent;
import flash.media.Camera;
import flash.media.Microphone;
import flash.media.Video;
import flash.net.NetConnection;
import flash.net.NetStream;
import flash.system.Security;
import flash.system.SecurityPanel;
import flash.display.Sprite;
import flash.text.TextField;
import flash.events.StatusEvent;

public class QandA extends Sprite {
    Security.LOCAL_TRUSTED;

    private var nc:NetConnection = null;
    private var camera:Camera;
    private var microphone:Microphone;
    private var nsPublish:NetStream = null;
    private var nsPlay:NetStream = null;
    private var videoCamera:Video;

    public var prompt:TextField;

    public function QandA():void {
        stage.align = "TL";
        stage.scaleMode = "noScale";

        videoCamera = new Video(160, 120);
        addChild(videoCamera);

        camera = Camera.getCamera();
        microphone = Microphone.getMicrophone();

        if (camera.muted) {
            trace("Camera Muted");
            Security.showSettings(SecurityPanel.CAMERA);
            camera.addEventListener(StatusEvent.STATUS, statusHandler);
        } else {
            startCamera();
        }
    }

    private function statusHandler(e:StatusEvent):void {
        if (e.code == "Camera.Unmuted") {
            trace("Camera Unmuted");
            startCamera();
            camera.removeEventListener(StatusEvent.STATUS, statusHandler);
        } else {
            trace("StatusEvent: " + e.code + " " + e.toString());
        }
    }

    private function startCamera():void {
        // here are all the quality and performance settings that we suggest
        camera.setMode(160, 120, 12, false);
        camera.setQuality(0, 75);
        camera.setKeyFrameInterval(24);

        microphone.rate = 11;
        microphone.setSilenceLevel(0);

        nc = new NetConnection();
        nc.connect("rtmp://localhost/live/");

        // get status information from the NetConnection object
        nc.addEventListener(NetStatusEvent.NET_STATUS, ncOnStatus);
    }

    private function nsPublishOnStatus(infoObject:NetStatusEvent):void {
        trace("nsPublish: " + infoObject.info.code + " (" + infoObject.info.description + ")");
    }

    private function ncOnStatus(infoObject:NetStatusEvent):void {
        trace("nc: " + infoObject.info.code + " (" + infoObject.info.description + ")");

        nsPublish = new NetStream(nc);
        nsPublish.addEventListener(NetStatusEvent.NET_STATUS, nsPublishOnStatus);
        nsPublish.bufferTime = 0;
        nsPublish.publish("testing");

        // attach the camera and microphone to the server
        nsPublish.attachCamera(camera);
        nsPublish.attachAudio(microphone);
    }
}
I'm fairly confident it's something simple, as I've seen this code on countless sites discussing how to publish to a live server.
Any help would be greatly appreciated. I've already tried running this code from a web server to rule out local security settings, but that wasn't the issue.
Logs I receive when debugging the application in Flash CS5:
Attempting to launch and connect to Player using URL D:\development\qanda\qandaHost.swf
[SWF] D:\development\qanda\qandaHost.swf - 3583 bytes after decompression
Camera Muted
nc: NetConnection.Connect.Success (Connection succeeded.)
nsPublish: NetStream.Publish.Start (Publishing testing.)
This line is wrong:
Security.showSettings(SecurityPanel.CAMERA);
You should write:
Security.showSettings(SecurityPanel.PRIVACY);
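For context, here is how the check in the constructor looks with that fix; only the panel constant changes. SecurityPanel.CAMERA opens the camera-selection tab of the Settings dialog, while SecurityPanel.PRIVACY opens the Allow/Deny tab that actually prompts the user:
if (camera.muted) {
    trace("Camera Muted");
    // PRIVACY shows the Allow/Deny prompt; CAMERA only selects the device
    Security.showSettings(SecurityPanel.PRIVACY);
    camera.addEventListener(StatusEvent.STATUS, statusHandler);
} else {
    startCamera();
}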
I wasn't attaching the camera to the video, thus I couldn't see myself -- even though the video was in fact streaming.
private function startCamera():void {
    trace("Attempting to start camera");

    // here are all the quality and performance settings that we suggest
    camera.setMode(160, 120, 12, false);
    camera.setQuality(0, 75);
    camera.setKeyFrameInterval(24);
    videoCamera.attachCamera(camera);

    microphone.rate = 11;
    microphone.setSilenceLevel(0);
}
I re-built an SWF audio uploader, and it was working fine for a few days. However, on a recent run-through of everything I've built for this project, I noticed that the POST request to the server that saves the audio file is being aborted about 10 seconds into the request. At 20 seconds everything stops, because of a timeout limitation that is implemented. I don't know much about AS3 or how it makes requests, but I do know the PHP handler file is solid, because no changes were made to it since the point at which it last worked. Permissions are 755 and ownership is correct on both the SWF and the handler file. I can also re-submit the request via Firebug, and it works with no issue whatsoever.
I'm not sure where exactly the problem stems from: browser, server, or code. I've read about aborted requests and ensured that there are no other active requests before trying to upload. I should also add that other POST/GET requests have no issues; it's just this one request from the SWF.
Again, Flash/ActionScript is not a strength of mine, so if there are ways to improve what I'm doing, or if anyone can tell me what I'm doing wrong, please tell me.
package {
    import flash.display.Sprite;
    import flash.media.Microphone;
    import flash.system.Security;
    import org.bytearray.micrecorder.*;
    import org.bytearray.micrecorder.events.RecordingEvent;
    import org.bytearray.micrecorder.encoder.WaveEncoder;
    import flash.events.MouseEvent;
    import flash.events.Event;
    import flash.events.ActivityEvent;
    import fl.transitions.Tween;
    import fl.transitions.easing.Strong;
    import flash.net.FileReference;
    import flash.net.URLLoader;
    import flash.net.URLRequest;
    import flash.net.URLRequestMethod;
    import flash.display.LoaderInfo;
    import flash.external.ExternalInterface;
    import flash.media.Sound;
    import org.as3wavsound.WavSound;
    import org.as3wavsound.WavSoundChannel;
    import com.adobe.serialization.json.JSON;
    import com.adobe.serialization.json.JSONDecoder;

    public class Main extends Sprite {
        private var mic:Microphone;
        private var requestor:URLLoader;
        private var waveEncoder:WaveEncoder = new WaveEncoder();
        private var recorder:MicRecorder = new MicRecorder(waveEncoder);
        private var recBar:RecBar = new RecBar();
        private var maxTime:Number = 30;
        private var tween:Tween;
        private var fileReference:FileReference = new FileReference();
        private var tts:WavSound;

        public function Main():void {
            trace('recording');
            recButton.visible = false;
            activity.visible = false;
            godText.visible = false;
            recBar.visible = false;

            mic = Microphone.getMicrophone();
            mic.setSilenceLevel(5);
            mic.gain = 50;
            mic.setLoopBack(false);
            mic.setUseEchoSuppression(true);

            Security.showSettings("2");

            requestor = new URLLoader();
            addListeners();
        }

        private function addListeners():void {
            recorder.addEventListener(RecordingEvent.RECORDING, recording);
            recorder.addEventListener(Event.COMPLETE, recordComplete);
            activity.addEventListener(Event.ENTER_FRAME, updateMeter);

            // accept calls from JavaScript to control recording
            ExternalInterface.addCallback("startRecording", startRecording);
            ExternalInterface.addCallback("stopRecording", stopRecording);
            ExternalInterface.addCallback("sendFileToServer", sendFileToServer);
        }

        // external JavaScript call to start recording
        public function startRecording(max_time:Number):void {
            maxTime = max_time;
            if (mic != null) {
                recorder.record();
                ExternalInterface.call("$.audioRec.callback_started_recording");
            } else {
                ExternalInterface.call("$.audioRec.callback_error_recording", 0);
            }
        }

        // external JavaScript call to stop recording
        public function stopRecording():void {
            recorder.stop();
            mic.setLoopBack(false);
            ExternalInterface.call("$.audioRec.callback_stopped_recording");
        }

        public function sendFileToServer():void {
            finalize_recording();
        }

        public function stopPreview():void {
            // no function is currently available
        }

        private function updateMeter(e:Event):void {
            ExternalInterface.call("$.audioRec.callback_activityLevel", mic.activityLevel);
        }

        private function recording(e:RecordingEvent):void {
            var currentTime:int = Math.floor(e.time / 1000);
            ExternalInterface.call("$.audioRec.callback_activityTime", String(currentTime));
            if (currentTime == maxTime) {
                stopRecording();
            }
        }

        private function recordComplete(e:Event):void {
            preview_recording();
        }

        private function preview_recording():void {
            tts = new WavSound(recorder.output);
            tts.play();
            ExternalInterface.call("$.audioRec.callback_started_preview");
        }

        // send the recorded data to the server
        private function finalize_recording():void {
            var host:String = '';
            var globalParam:Object = LoaderInfo(this.root.loaderInfo).parameters;
            for (var element:String in globalParam) {
                if (element == 'host') {
                    host = globalParam[element];
                }
            }

            ExternalInterface.call("$.audioRec.callback_finished_recording");

            if (host != '') {
                ExternalInterface.call("$.audioRec.callback_started_sending");
                var req:URLRequest = new URLRequest(host);
                req.contentType = 'application/octet-stream';
                req.method = URLRequestMethod.POST;
                req.data = recorder.output;
                requestor.addEventListener(Event.COMPLETE, requestCompleteHandler);
                requestor.load(req);
            }
        }

        private function requestCompleteHandler(event:Event):void {
            ExternalInterface.call("$.audioRec.callback_finished_sending", requestor.data);
        }

        private function getFlashVars():Object {
            return Object(LoaderInfo(this.loaderInfo).parameters);
        }
    }
}
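Since an abort like this is invisible when only Event.COMPLETE is handled, it's worth listening for every failure event the URLLoader can dispatch before calling load(). A minimal diagnostic sketch against the requestor from the code above (the inline logging listeners are illustrative, not from the original project):
// extra imports needed: flash.events.IOErrorEvent,
// flash.events.SecurityErrorEvent, flash.events.HTTPStatusEvent
requestor.addEventListener(IOErrorEvent.IO_ERROR, function(e:IOErrorEvent):void {
    ExternalInterface.call("console.log", "IO error: " + e.text);
});
requestor.addEventListener(SecurityErrorEvent.SECURITY_ERROR, function(e:SecurityErrorEvent):void {
    ExternalInterface.call("console.log", "security error: " + e.text);
});
requestor.addEventListener(HTTPStatusEvent.HTTP_STATUS, function(e:HTTPStatusEvent):void {
    ExternalInterface.call("console.log", "HTTP status: " + e.status);
});
If the abort shows up as an IO error with an HTTP status of 0, that points at the connection being dropped rather than at the PHP handler.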
I am having no luck trying to create a simple camera in ActionScript. I don't want any controls: just a stage asking for permission, then a live video of me in a window. Nothing fancy.
Here's what I have so far (the latest failure):
package {
    import flash.display.Sprite;
    import flash.media.*;

    public class FlashCamera extends Sprite
    {
        var cam:FlashCamera = Camera.getCamera();
        var vid:Video = new Video();
        vid.attachCamera(cam);
        addChild(vid);
    }
}
It throws this error when I try to compile:
call to a possibly undefined method getCamera through a reference with static type Class
I'm compiling with the Flex SDK from the Windows command line, like this:
(path to SDK)/bin/mxmlc Camera.as
Mind you, I am new to ActionScript/Flash development.
Can someone please explain what I am doing wrong?
For one thing, you're using classes named Camera from two different namespaces without disambiguating between the two. You'll probably also have to import other packages to support the API versions of Camera and Video (flash.media.Camera and flash.media.Video), though I'm not completely convinced this won't be done implicitly, especially without knowing the environment you're using.
Another thing to watch out for, as far as runtime errors go, is that it can take the browser a few seconds to actually get the camera; just keep trying to grab it, for at least a few seconds, until it returns something other than null. A rough sketch of that follows.
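Here is one way to write that retry loop with a Timer (a sketch; the class and file name CameraRetry are illustrative, and saving the file under its own name also avoids shadowing flash.media.Camera the way compiling it as Camera.as does):
package {
    import flash.display.Sprite;
    import flash.events.TimerEvent;
    import flash.media.Camera;
    import flash.media.Video;
    import flash.utils.Timer;

    public class CameraRetry extends Sprite {
        // poll every 250 ms, give up after 20 tries (~5 seconds)
        private var pollTimer:Timer = new Timer(250, 20);

        public function CameraRetry() {
            pollTimer.addEventListener(TimerEvent.TIMER, tryCamera);
            pollTimer.start();
        }

        private function tryCamera(e:TimerEvent):void {
            var cam:Camera = Camera.getCamera();
            if (cam == null) return; // not available yet, keep polling

            pollTimer.stop();
            var vid:Video = new Video();
            vid.attachCamera(cam);
            addChild(vid);
        }
    }
}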
Found something that actually works, and will be adapting it for my needs:
package
{
    import flash.display.Sprite;
    import flash.events.NetStatusEvent;
    import flash.net.NetConnection;
    import flash.net.NetStream;
    import flash.media.Camera;
    import flash.media.Microphone;
    import flash.media.Video;

    public class FlashVideo extends Sprite
    {
        private var nc:NetConnection;
        private var good:Boolean;
        private var rtmpNow:String;
        private var nsIn:NetStream;
        private var nsOut:NetStream;
        private var cam:Camera;
        private var mic:Microphone;
        private var vidLocal:Video;
        private var vidStream:Video;

        public function FlashVideo()
        {
            trace("Hello testing");
            rtmpNow = "rtmp://localhost/LiveStreams";
            nc = new NetConnection();
            nc.connect(rtmpNow);
            nc.addEventListener(NetStatusEvent.NET_STATUS, checkCon);
            setCam();
            setMic();
            setVideo();
        }

        private function checkCon(e:NetStatusEvent):void
        {
            good = e.info.code == "NetConnection.Connect.Success";
            if (good)
            {
                nsOut = new NetStream(nc);
                nsOut.attachAudio(mic);
                nsOut.attachCamera(cam);
                nsOut.publish("left", "live");

                nsIn = new NetStream(nc);
                nsIn.play("right");
                vidStream.attachNetStream(nsIn);
            }
        }

        private function setCam():void
        {
            cam = Camera.getCamera();
            cam.setKeyFrameInterval(9);
            cam.setMode(640, 400, 30);
            cam.setQuality(0, 95);
        }

        private function setMic():void
        {
            mic = Microphone.getMicrophone();
            mic.gain = 85;
            mic.rate = 11;
            mic.setSilenceLevel(15, 2000);
        }

        private function setVideo():void
        {
            vidLocal = new Video(cam.width, cam.height);
            addChild(vidLocal);
            vidLocal.x = 15;
            vidLocal.y = 30;
            vidLocal.attachCamera(cam);

            vidStream = new Video(cam.width, cam.height);
            addChild(vidStream);
            vidStream.x = (vidLocal.x + cam.width + 10);
            vidStream.y = vidLocal.y;
        }
    }
}
I'm building an application with a Strobe Media Playback in it, and I can't play an RTMP stream.
The weird thing is that I can play the stream in JWPlayer and even in a simple Flash video player with the classic NetStream and NetConnection code.
http://www.longtailvideo.com/jw-player/wizard/
This is the stream: rtmp://fl.world-television.com/streamstudio/vod/flv:endesa/20130422/video_full_es_v2.flv
I don't know if I need to set up any special configuration through the OSMF params.
This is how I've been playing the video in a simple Flash CS application:
package {
    import flash.display.Sprite;
    import flash.events.NetStatusEvent;
    import flash.events.SecurityErrorEvent;
    import flash.media.Video;
    import flash.net.NetConnection;
    import flash.net.NetStream;
    import flash.events.Event;

    public class VideoTest extends Sprite {
        private var videoURL:String = "rtmp://fl.world-television.com/streamstudio/vod/flv:endesa/20130422/video_full_es_v2.flv";
        private var connection:NetConnection;
        private var stream:NetStream;

        public function VideoTest()
        {
            initialize();
        }

        private function initialize():void
        {
            connection = new NetConnection();
            connection.addEventListener(NetStatusEvent.NET_STATUS, netStatusHandler);
            connection.addEventListener(SecurityErrorEvent.SECURITY_ERROR, securityErrorHandler);
            connection.connect("rtmp://fl.world-television.com/streamstudio/vod/");
        }

        private function netStatusHandler(event:NetStatusEvent):void {
            trace("net status handler: " + event.info.code);
            switch (event.info.code) {
                case "NetConnection.Connect.Success":
                    connectStream();
                    break;
                case "NetStream.Play.StreamNotFound":
                    trace("Stream not found: " + videoURL);
                    break;
            }
        }

        private function securityErrorHandler(event:SecurityErrorEvent):void {
            trace("securityErrorHandler: " + event);
        }

        private function connectStream():void
        {
            stream = new NetStream(connection);
            stream.addEventListener(NetStatusEvent.NET_STATUS, netStatusHandler);

            var video:Video = new Video();
            video.attachNetStream(stream);
            stream.play("endesa/20130422/video_full_es_v2");
            addChild(video);
        }
    }
}
I really think the problem is that OSMF isn't slicing the URL correctly, so it ends up connecting (NetConnection) or playing (NetStream) with a bad string.
EDIT:
Solved by removing "/vod" from the URL.
You could try it here:
http://osmf.org/dev/2.0gm/debug.html?wmode=direct&width=470&height=320&src=rtmp%3A%2F%2Ffl.world-television.com%2Fstreamstudio%2Fendesa%2F20130422%2Fvideo_full_es_v2
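For reference, the corrected URL corresponds to this split between application name and stream name. A NetConnection/NetStream sketch of what OSMF presumably ends up doing with it (an assumption inferred from the working URL, not taken from OSMF internals):
// first path segment becomes the application for connect()
connection.connect("rtmp://fl.world-television.com/streamstudio/");
// everything after it becomes the stream name for play()
stream.play("endesa/20130422/video_full_es_v2");
With "/vod" in the path, that split lands in the wrong place, which matches the StreamNotFound behaviour described above.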
I know there are many ways to play an FLV file, but given my project requirements I need to play the FLV using URLStream and NetStream.
Here's the complete sample code I'm running my tests on:
package
{
    import flash.display.Sprite;
    import flash.events.Event;
    import flash.events.NetStatusEvent;
    import flash.events.ProgressEvent;
    import flash.utils.ByteArray;
    import flash.net.URLRequest;
    import flash.net.URLLoader;
    import flash.net.URLLoaderDataFormat;
    import flash.net.NetStreamAppendBytesAction;
    import flash.net.NetConnection;
    import flash.net.NetStream;
    import flash.media.Video;
    import flash.net.URLStream;

    /**
     * ...
     * @author Hadi Tavakoli
     */
    public class Main extends Sprite
    {
        private var netConnection:NetConnection;
        private var netStream:NetStream;
        private var ul:URLStream;
        private var video:Video;
        private var bytes:ByteArray = new ByteArray();
        private var _isSeek:Boolean = false;

        public function Main():void
        {
            if (stage) init();
            else addEventListener(Event.ADDED_TO_STAGE, init);
        }

        private function init(e:Event = null):void
        {
            removeEventListener(Event.ADDED_TO_STAGE, init);

            // entry point
            video = new Video();
            addChild(video);

            netConnection = new NetConnection();
            netConnection.addEventListener(NetStatusEvent.NET_STATUS, netConnectionStatusHandler);
            netConnection.connect(null);
        }

        private function netConnectionStatusHandler(ev:NetStatusEvent):void
        {
            switch (ev.info.code)
            {
                case 'NetConnection.Connect.Success':
                    ul = new URLStream();
                    ul.addEventListener(ProgressEvent.PROGRESS, onProgress);
                    ul.load(new URLRequest('01.flv'));
                    break;
            }
        }

        private function onProgress(e:ProgressEvent):void
        {
            ul.readBytes(bytes, bytes.length);

            if (!netStream)
            {
                netStream = new NetStream(netConnection);
                netStream.client = {};
                video.attachNetStream(netStream);
                netStream.play(null);

                trace("BEGIN");
                netStream.appendBytesAction(NetStreamAppendBytesAction.RESET_BEGIN);
            }
            else
            {
                if (!_isSeek)
                {
                    trace("SEEK");
                    netStream.appendBytesAction(NetStreamAppendBytesAction.RESET_SEEK);
                    _isSeek = true;
                }
            }

            if (bytes.length == e.bytesTotal)
            {
                trace("END");
                netStream.appendBytesAction(NetStreamAppendBytesAction.END_SEQUENCE);
            }

            netStream.appendBytes(bytes);
            trace("-");
        }
    }
}
I'm not sure if I'm using the appendBytes method correctly. The video shows, but only the first few frames play and then it stops!
To my eyes it all seems OK; do you have any advice on where the problem is?
I don't think you need the if (!_isSeek) block. You're pushing the bytes as you receive them, in sequential order, so there's never a seek: as written, it pushes the first set of bytes, then appends a seek action, then appends the rest of the bytes. Try just removing that block and see if it works.
Otherwise I think it's OK.
in "ul.readBytes(bytes, bytes.length);" line, there is a bug i guess. It's never worked for me also. It always return full length (from 0 to the available bytes). So It have a huge memory leak. But if you are using flash player 11.4 or later, you can change it like this.
ul.position = bytes.length;
ul.readBytes(bytes);
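Putting the two answers above together, one way to write the handler is to read each newly arrived chunk into a fresh buffer and append only that chunk, dropping the seek branch entirely (a sketch; bytesLoaded is an assumed new Number field initialized to 0, not part of the original class):
private function onProgress(e:ProgressEvent):void
{
    // read only the bytes that arrived since the last event
    var chunk:ByteArray = new ByteArray();
    ul.readBytes(chunk);
    bytesLoaded += chunk.length;

    if (!netStream)
    {
        netStream = new NetStream(netConnection);
        netStream.client = {};
        video.attachNetStream(netStream);
        netStream.play(null);
        netStream.appendBytesAction(NetStreamAppendBytesAction.RESET_BEGIN);
    }

    // append just this chunk, not the whole accumulated buffer
    netStream.appendBytes(chunk);

    if (bytesLoaded == e.bytesTotal)
    {
        netStream.appendBytesAction(NetStreamAppendBytesAction.END_SEQUENCE);
    }
}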
Is there a way to use AS3 to access the iPad's camera?
I mean start the camera from within the app itself, have the ability to take a shot, save the image to a ByteArray, and apply the image to the background or do some manipulation with it.
I've done some research, but most of what I found only shows how to access the camera on Android devices.
Thanks for any suggestion or help.
Yes, you can absolutely do this. The beauty of Flash is that the code to do it is the same you would use on Android or a PC.
Literally, you can do this to connect the camera to a Video object:
var camera:Camera = Camera.getCamera();
var video:Video = new Video();
video.attachCamera(camera);
this.addChild(video); // 'this' would be a Sprite or UIComponent, etc...
There's a lot more to do if you want to do something useful, but it's fairly straightforward once you get started :)
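Since the question also asks about grabbing a still into a ByteArray and using it as a background, a minimal sketch building on the snippet above (video and camera come from that snippet; the rest is illustrative, and needs flash.display.Bitmap, flash.display.BitmapData, and flash.utils.ByteArray imported):
// draw the current video frame into a BitmapData
var snapshot:BitmapData = new BitmapData(video.width, video.height);
snapshot.draw(video);

// raw ARGB pixels as a ByteArray, if that's the format you need
var pixels:ByteArray = snapshot.getPixels(snapshot.rect);

// apply the snapshot as a background layer
var background:Bitmap = new Bitmap(snapshot);
this.addChildAt(background, 0);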
bluebill1049, I'm not certain from the thread whether you got what you were looking for, but I did see your request for the whole class. I found the same information (that Jason Sturges posted in his answer) in this post:
take photo using Adobe Builder (flex) for iOS
Unlike his reply here, his reply to that post had a link to a great tutorial on building a mobile app, and it was from that tutorial that this code was lifted/quoted. It requires an event class (events.CameraEvent, only a few lines) that's contained in that project/tutorial, so it's important to be able to go back to the source, as it were. That source is located here:
http://devgirl.org/files/RIAUnleashed/
My thanks to Jason. Just so you don't have to dig, here's the event class that's missing from the quote:
package events
{
    import flash.events.Event;
    import flash.filesystem.File;

    public class CameraEvent extends Event
    {
        public static const FILE_READY:String = "fileReady";

        public var file:File;

        public function CameraEvent(type:String, file:File = null, bubbles:Boolean = true, cancelable:Boolean = true)
        {
            super(type, bubbles, cancelable);
            this.file = file;
        }
    }
}
Hope that helps!
Using the Loader is not the only way to access the image bytes on iOS. It turns out the data is already in JPEG format to begin with, so encoding it again is not necessary.
Just do a mediaPromise.open() to get at the bytes and save them directly instead.
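A sketch of that approach, as a replacement for the Loader branch in the CameraUtil class quoted below (saveJpeg is an illustrative helper, and flash.utils.IDataInput plus flash.events.IEventDispatcher would need importing; on iOS the promise is usually asynchronous, so wait for Event.COMPLETE before reading):
protected function mediaEventComplete(event:MediaEvent):void
{
    var mediaPromise:MediaPromise = event.data;
    var input:IDataInput = mediaPromise.open();

    if (mediaPromise.isAsync)
    {
        // wait until the promise has delivered all of its data
        IEventDispatcher(input).addEventListener(Event.COMPLETE,
            function(e:Event):void { saveJpeg(input); });
    }
    else
    {
        saveJpeg(input);
    }
}

private function saveJpeg(input:IDataInput):void
{
    // the bytes are already JPEG-encoded, so no JPEGEncoder pass is needed
    var bytes:ByteArray = new ByteArray();
    input.readBytes(bytes);

    file = File.applicationStorageDirectory.resolvePath("receipt" + new Date().time + ".jpg");
    var stream:FileStream = new FileStream();
    stream.open(file, FileMode.WRITE);
    stream.writeBytes(bytes);
    stream.close();

    dispatchEvent(new CameraEvent(CameraEvent.FILE_READY, file));
}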
XpenseIt example code offers this camera implementation:
Class: CameraUtil:
package utils
{
    import events.CameraEvent;
    import flash.display.BitmapData;
    import flash.display.Loader;
    import flash.display.LoaderInfo;
    import flash.events.Event;
    import flash.events.EventDispatcher;
    import flash.events.MediaEvent;
    import flash.filesystem.File;
    import flash.filesystem.FileMode;
    import flash.filesystem.FileStream;
    import flash.media.CameraRoll;
    import flash.media.CameraUI;
    import flash.media.MediaPromise;
    import flash.media.MediaType;
    import flash.utils.ByteArray;
    import mx.events.DynamicEvent;
    import mx.graphics.codec.JPEGEncoder;

    [Event(name = "fileReady", type = "events.CameraEvent")]
    public class CameraUtil extends EventDispatcher
    {
        protected var camera:CameraUI;
        protected var loader:Loader;

        public var file:File;

        public function CameraUtil()
        {
            if (CameraUI.isSupported)
            {
                camera = new CameraUI();
                camera.addEventListener(MediaEvent.COMPLETE, mediaEventComplete);
            }
        }

        public function takePicture():void
        {
            if (camera)
                camera.launch(MediaType.IMAGE);
        }

        protected function mediaEventComplete(event:MediaEvent):void
        {
            var mediaPromise:MediaPromise = event.data;
            if (mediaPromise.file == null)
            {
                // For iOS we need to load with a Loader first
                loader = new Loader();
                loader.contentLoaderInfo.addEventListener(Event.COMPLETE, loaderCompleted);
                loader.loadFilePromise(mediaPromise);
                return;
            }
            else
            {
                // Android we can just dispatch the event that it's complete
                file = new File(mediaPromise.file.url);
                dispatchEvent(new CameraEvent(CameraEvent.FILE_READY, file));
            }
        }

        protected function loaderCompleted(event:Event):void
        {
            var loaderInfo:LoaderInfo = event.target as LoaderInfo;
            if (CameraRoll.supportsAddBitmapData)
            {
                var bitmapData:BitmapData = new BitmapData(loaderInfo.width, loaderInfo.height);
                bitmapData.draw(loaderInfo.loader);

                file = File.applicationStorageDirectory.resolvePath("receipt" + new Date().time + ".jpg");
                var stream:FileStream = new FileStream();
                stream.open(file, FileMode.WRITE);

                var j:JPEGEncoder = new JPEGEncoder();
                var bytes:ByteArray = j.encode(bitmapData);
                stream.writeBytes(bytes, 0, bytes.bytesAvailable);
                stream.close();

                trace(file.url);
                dispatchEvent(new CameraEvent(CameraEvent.FILE_READY, file));
            }
        }
    }
}
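For completeness, typical usage of that class looks like this (a sketch; the handler name is illustrative):
var cameraUtil:CameraUtil = new CameraUtil();
cameraUtil.addEventListener(CameraEvent.FILE_READY, onPictureReady);
cameraUtil.takePicture();

private function onPictureReady(event:CameraEvent):void
{
    trace("photo saved to " + event.file.url);
}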