How to show song download progress as the song's duration time - actionscript-3

What I'm trying to do is to show the song download progress in the form of the song's duration time. For example: 00:00, 01:05, 02:14, 03:58, .... 04:13, where 04:13 is the song's total duration. So far I have this code:
var soundClip:Sound;
var sTransform:SoundTransform = new SoundTransform(0.1);

function init() {
    soundClip = new Sound();
    soundClip.load(new URLRequest("magneto.mp3"));
    //soundClip.load(new URLRequest("making.mp3"));
    soundClip.addEventListener(Event.COMPLETE, soundLoaded);
    soundClip.addEventListener(ProgressEvent.PROGRESS, soundLoading);
}
init();

function convertTime(millis:Number):String {
    var displayMinutes:String;
    var displaySeconds:String;
    var minutes:Number = (millis % (1000 * 60 * 60)) / (1000 * 60);
    var seconds:Number = ((millis % (1000 * 60 * 60)) % (1000 * 60)) / 1000;
    if (minutes < 10) {
        displayMinutes = "0" + Math.floor(minutes);
    } else {
        displayMinutes = Math.floor(minutes).toString();
    }
    if (seconds < 10) {
        displaySeconds = "0" + Math.floor(seconds);
    } else {
        displaySeconds = Math.floor(seconds).toString();
    }
    return displayMinutes + ":" + displaySeconds;
}

function soundLoaded(e:Event) {
    soundClip.play(0, 0, sTransform);
}

function soundLoading(e:ProgressEvent) {
    trace(convertTime(soundClip.length));
}
As you can see, I'm testing it out with two songs. According to the code above, the durations are 03:52 and 11:28, but according to the player these two songs actually last 03:52 and 05:44. Here is the code and both mp3 files.
Thank you.
EDIT: I'm analyzing a page which plays the song making.mp3. After debugging it, I realized that there is a value passed to the player that goes like this: 0, 0, 2664, 7576, ... 344370. These values are shown as *00:00, 01:05, 02:14, 03:58, .... 04:13* as the download progresses. Knowing where this data comes from would solve my problem; initially I thought it could be obtained through the length property, but that only worked well for the magneto.mp3 file, not for both songs.
On the whole I want to show:
00:00, 00:23, 01:23 ... 03:57 (where 03:57 is the duration of any given song) as the download progresses.
Thank you for helping me. Cheers :)

Your code has no problems and your technique is correct.
You only need to fetch the total duration at the end of the download. The value of the length property changes as more data is retrieved, so if the download stops before reaching the end, you will only have the length of the incomplete file. Consider adding another handler to check for errors; if it fires, let the user know that the file download is incomplete.
Update: I figured it out. Add a call to convertTime() in the soundLoaded() method.
What is happening is that the file length you see is the one reported during the PROGRESS events. The final length is often only available in the COMPLETE event, because the PROGRESS handler fires only while the download is incomplete, not after it has finished.
Keep the convertTime() call in the PROGRESS event handler as you do presently.
private function soundLoaded(e:Event):void
{
    soundClip.play(0, 0, sTransform);
    trace(convertTime(soundClip.length));
}
This should do it.
Update 2: This is a known issue reported on many forums. The length of a sound file sampled at less than 44.1 kHz is reported incorrectly while the download is in progress; only after the download completes is the correct duration reported. This affects SWF files targeting version 9 or lower.
Changing the output SWF to version 10+ fixes the issue.
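If you need a usable total duration while the download is still running, a common workaround (not part of the original answer, so treat it as a sketch) is to extrapolate the full length from the bytes loaded so far. This assumes a roughly constant bitrate, which holds for most CBR MP3s:

function estimateTotalTime(sound:Sound):String {
    if (sound.bytesLoaded == 0 || sound.bytesTotal == 0) {
        return "00:00";
    }
    // length covers only the loaded portion; scale it up by the
    // fraction of the file retrieved so far.
    var estimatedMillis:Number = sound.length / (sound.bytesLoaded / sound.bytesTotal);
    return convertTime(estimatedMillis);
}

Your soundLoading() handler could then become:

function soundLoading(e:ProgressEvent):void {
    // e.g. "01:05 of 04:13" while the file is still downloading
    trace(convertTime(soundClip.length) + " of " + estimateTotalTime(soundClip));
}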

Related

onActivityResult crashes when custom location for video (MediaStore.EXTRA_OUTPUT, URI) is set

So this is my setup for the 'intent':
Intent cameraACTION_VIDEO_CAPTURE = new Intent(MediaStore.ACTION_VIDEO_CAPTURE);
tempUri = accessLocalStorage.getThisAppsStorageUriPath();
//Crashed for tempUri = "/data/user/0/hardy.android.go/app_files/test.mp4"
//Crashed for tempUri = "/data/user/0/hardy.android.go/app_files/"
cameraACTION_VIDEO_CAPTURE.putExtra(MediaStore.EXTRA_OUTPUT, tempUri);
cameraACTION_VIDEO_CAPTURE.setFlags(cameraACTION_VIDEO_CAPTURE.getFlags() | Intent.FLAG_ACTIVITY_NO_HISTORY);
startActivityForResult(cameraACTION_VIDEO_CAPTURE,
        Integer.parseInt(DataModel.SETVIDEORECORDING.toString()));
The video intent starts as expected but crashes once I finish recording - it doesn't even make it to 'onActivityResult'. The error is:
java.lang.NullPointerException: Attempt to invoke virtual method 'int android.graphics.Bitmap.getWidth()' on a null object reference
I don't know why there is a Bitmap floating around in there.
Anyway, in an attempt to pinpoint the issue, I commented out the following line and tried again:
cameraACTION_VIDEO_CAPTURE.putExtra(MediaStore.EXTRA_OUTPUT, tempUri);
and it works :( - video is stored here:
/storage/emulated/0/DCIM/Camera/VID_20181004_213440310_HDR.mp4
OK, I have a partial answer. In terms of the application not crashing, I made progress by using FileProvider to generate the Uri.
tempUri = FileProvider.getUriForFile(
        context,
        BuildConfig.APPLICATION_ID + ".provider",
        new File(accessLocalStorage.getThisAppsStorageUriPath().getPath())
);
However, the video File saved at the path in the Uri was of size/length 0, and I didn't have time to work through this, so this story ends here :( - hope this is of some help for others!
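One possible cause of the zero-length file (an assumption on my part, not something verified in this thread) is that the external camera app lacks permission to write to the content:// Uri. A minimal sketch of granting that permission when launching the capture intent (REQUEST_VIDEO_CAPTURE is a hypothetical request code):

Intent captureIntent = new Intent(MediaStore.ACTION_VIDEO_CAPTURE);
captureIntent.putExtra(MediaStore.EXTRA_OUTPUT, tempUri);
// Without these flags, some camera apps cannot write to a content:// Uri
// obtained from FileProvider, which can leave a zero-byte file behind.
captureIntent.addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION
        | Intent.FLAG_GRANT_WRITE_URI_PERMISSION);
startActivityForResult(captureIntent, REQUEST_VIDEO_CAPTURE);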

How to write a code in Android to send waypoints to my 3DRobotics drone?

Good afternoon.
At the moment I am trying to write the code in the "Main Activity" to send some waypoints to my IRIS drone, but it only works when there are five points. Could you check my code and give me suggestions about what is happening and how I can send more waypoints to my drone? I really appreciate your help because I am new to developing on Android:
Code:
public void onBtnConnectTap3(View view) {
    if (this.drone.isConnected()) {
        this.drone.disconnect();
    } else {
        Spinner connectionSelector = (Spinner) findViewById(R.id.selectConnectionType);
        int selectedConnectionType = connectionSelector.getSelectedItemPosition();
        Bundle extraParams = new Bundle();
        if (selectedConnectionType == ConnectionType.TYPE_USB) {
            extraParams.putInt(ConnectionType.EXTRA_USB_BAUD_RATE, DEFAULT_USB_BAUD_RATE); // Set default baud rate to 57600
        } else {
            extraParams.putInt(ConnectionType.EXTRA_UDP_SERVER_PORT, DEFAULT_UDP_PORT); // Set default UDP server port to 14550
        }
        ConnectionParameter connectionParams = new ConnectionParameter(selectedConnectionType, extraParams, null);
        this.drone.connect(connectionParams);
    }

    currentMission = new Mission();
    currentMission.clear();
    for (int i = 1; i < 20; i++) {
        waypoint2 = new Waypoint();
        yaw = new YawCondition();
        waypoint2.setCoordinate(new LatLongAlt(i, i, i));
        yaw.setAngle(i);
        missionI3 = waypoint2;
        currentMission.addMissionItem(missionI3);
        missionI2 = yaw;
        currentMission.addMissionItem(missionI2);
    }
    this.drone.generateDronie();
    this.drone.setMission(currentMission, true);
    this.drone.arm(true);
}
Dependencies in build.gradle:
dependencies {
    compile fileTree(dir: 'libs', include: ['*.jar'])
    compile 'com.android.support:appcompat-v7:22.1.1'
    compile 'com.o3dr.android:dronekit-android:2.3.11'
}
I would also like to know where I can keep learning about developing Android apps for 3DRobotics drones, considering that my main sources are http://android.dronekit.io/first_app.html and http://android.dronekit.io/javadoc/.
Thanks in advance for your answer.
I'm not completely sure what you are trying to accomplish, but I see some possible errors in your code.
Use the latest version of dronekit-android. The current version is 2.7.0. You can keep up to date on the versions here: https://bintray.com/3drobotics/maven/dronekit-android/view
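Concretely (assuming 2.7.0 is still current), that would mean bumping the dependency line in the build.gradle shown above:

compile 'com.o3dr.android:dronekit-android:2.7.0'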
You are generating a mission with 38 items (19 waypoints and 19 yaws). You are also doing a very unsafe thing by setting waypoint coordinates to 1,1,1 ... 19,19,19; your vehicle will fly somewhere you presumably didn't intend.
I'm unsure why you have generateDronie(). As per the docs
Generate action to create a dronie mission, and upload it to the connected drone.
A dronie is a specific type mission that will fly a selfie path.
setMission() is correct. However, the last step in your code is to arm the vehicle; you will also need to tell the drone to actually run the mission. You can do this with the startMission() method in the MissionApi class.
Be careful setting and starting mission with the same user interaction. There is always the chance that setMission() will fail to upload to the vehicle. If this is the case, startMission() will run the last mission that was successfully uploaded to the vehicle.
You can verify the upload succeeded by listening for the broadcast AttributeEvent.MISSION_SENT.
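As a rough sketch of that flow (the exact signatures are from the dronekit-android 2.x javadocs as I recall them, so treat them as assumptions and check against your version), assuming your activity is registered as a DroneListener:

@Override
public void onDroneEvent(String event, Bundle extras) {
    if (AttributeEvent.MISSION_SENT.equals(event)) {
        // The upload was confirmed by the vehicle, so it is now safe
        // to run the mission. startMission(forceModeChange, forceArm,
        // listener) lives in the MissionApi class.
        MissionApi.getApi(this.drone).startMission(true, true, null);
    }
}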
You can always contribute to the documentation by adding javadocs to APIs that you feel are missing or need clarification.

AMS doesn't receive unpublish command SOMETIMES over rtmpt

This one has had me going for a week at least. I am trying to record a video file to AMS. It works great almost all of the time, except that in about 1 in 10 or 15 recording sessions I never receive 'NetStream.Unpublish.Success' on my NetStream from AMS when I close the stream. This happens when I am connecting to AMS over RTMPT; it seems to work fine over RTMP. Also, it seems to happen only in Safari on Mac, but since it's so intermittent I don't really trust that observation. Here is my basic flow:
// just a way to use promises with netStatusEvents
private function netListener(code:String, netObject:*):Promise {
    var deferred:Deferred = new Deferred();
    var netStatusHandler:Function = function (event:NetStatusEvent):void {
        if (event.info.level == 'error') {
            deferred.reject(event);
        } else if (event.info.code == code) {
            deferred.resolve(netObject);
            // we want this to be a one time listener since the connection can swap between record/playback
            netObject.removeEventListener(NetStatusEvent.NET_STATUS, netStatusHandler);
        }
    };
    netObject.addEventListener(NetStatusEvent.NET_STATUS, netStatusHandler);
    return deferred.promise;
}
// set up for recording
private function initRecord():void {
    Settings.recordFile = Settings.uniquePrefix + (new Date()).getTime();
    // detach any existing NetStream from the video
    _view.video.attachNetStream(null);
    // dispose of existing NetStream
    if (_videoStream) {
        _videoStream.dispose();
        _videoStream = null;
    }
    // disconnect before connecting anew
    (_nc.connected ? netListener('NetConnection.Connect.Closed', _nc) : Promise.when(_nc))
        .then(function (nc:NetConnection):void {
            netListener('NetConnection.Connect.Success', _nc)
                .then(function (nc:NetConnection):void {
                    _view.video.attachCamera(_webcam);
                    // get new NetStream
                    _videoStream = getNetStream(_nc);
                    ExternalInterface.call("CTplayer." + Settings.instanceName + ".onRecordReady", true);
                }, function (error:NetStatusEvent):void {
                    ExternalInterface.call("CTplayer." + Settings.instanceName + ".onError", error.info);
                });
            _nc.connect(Settings.recordServer);
        }); // end ncClose
    if (_nc.connected) _nc.close();
}
// stop recording
private function stop():void {
    netListener('NetStream.Unpublish.Success', _videoStream)
        .then(function (ns:NetStream):void {
            ExternalInterface.call("CTplayer." + Settings.instanceName + ".onRecordStop", Settings.recordFile);
        });
    _videoStream.attachCamera(null);
    _videoStream.attachAudio(null);
    _videoStream.close();
}

// start recording
private function record():void {
    netListener('NetStream.Publish.Start', _videoStream)
        .then(function (ns:NetStream):void {
            ExternalInterface.call("CTplayer." + Settings.instanceName + ".onRecording");
        });
    _videoStream.attachCamera(_webcam);
    _videoStream.attachAudio(_microphone);
    _videoStream.publish(Settings.recordFile, "record"); // fires NetStream.Publish.Start
}
Update
I am now using a new NetConnection per connection attempt and also not forcing port 80 (see my 'answer' below). This has not solved my connection woes, only made the failures more infrequent. Now, about once a week, I still see some random failure of AMS or Flash. Most recently someone made a recording and then Flash Player was unable to load the video for playback; the AMS logs show a connection attempt and then nothing. There should at least be a play event logged from when I load the metadata. This is quite frustrating and nearly impossible to debug.
I would try 2 distinct NetConnection objects, one for record and one for replay. This will remove the complexity around adding/removing listeners and the connect/reconnect/disconnect logic, and would IMO be cleaner.
NetConnections are cheap, and I've always used one per task at hand. The other advantage is that you can connect both at startup so the replay connection is ready instantly.
I've not seen a Promise used here before, but I'm not qualified to comment if that may cause a problem or not.
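A minimal sketch of that two-connection setup (the names are illustrative, not from the poster's code, and Settings.playbackServer is a hypothetical setting):

private var _recordNC:NetConnection = new NetConnection();
private var _playbackNC:NetConnection = new NetConnection();

private function connectBoth():void {
    _recordNC.addEventListener(NetStatusEvent.NET_STATUS, onRecordStatus);
    _playbackNC.addEventListener(NetStatusEvent.NET_STATUS, onPlaybackStatus);
    // Connect both at startup so the replay connection is ready instantly.
    _recordNC.connect(Settings.recordServer);
    _playbackNC.connect(Settings.playbackServer); // hypothetical setting
}

private function onRecordStatus(event:NetStatusEvent):void {
    trace("record:", event.info.code);
}

private function onPlaybackStatus(event:NetStatusEvent):void {
    trace("playback:", event.info.code);
}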
I think my issue was connecting over port 80. I originally thought I had to use port 80 with RTMPT, so I set my Settings.recordServer variable to rtmpt://myamsserver.net:80/app. I'm now using a shotgun approach where I try a bunch of port/protocol combos at once and pick the first one to connect. It almost always picks port 443 over RTMPT, which seems much faster and more stable all around than port 80, and I haven't had this issue since. It could also be due to not reusing the same NetConnection object, as Stefan suggested; it's hard to say.
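For reference, a sketch of what that shotgun approach can look like (the candidate URLs are illustrative, not the poster's actual list):

private var _candidates:Array = [
    "rtmp://myamsserver.net/app",
    "rtmpt://myamsserver.net:443/app",
    "rtmpt://myamsserver.net:80/app"
];
private var _winner:NetConnection;

private function shotgunConnect():void {
    for each (var url:String in _candidates) {
        var nc:NetConnection = new NetConnection();
        nc.addEventListener(NetStatusEvent.NET_STATUS, onShotgunStatus);
        nc.connect(url);
    }
}

private function onShotgunStatus(event:NetStatusEvent):void {
    var nc:NetConnection = event.target as NetConnection;
    if (event.info.code == "NetConnection.Connect.Success") {
        if (!_winner) {
            _winner = nc; // first successful connection wins
        } else {
            nc.close(); // close the slower connections
        }
    }
}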

How to browse mobile directory in flex?

I have captured 3 videos on my mobile, which are by default stored in the phone gallery (Gallery/videos/). I have to play these 3 videos in one of my Flex mobile applications. How can I get the videos into the Flex project? If I need to browse the mobile directory, kindly help me with some code to do so.
I too am looking for an answer to this question. Right now, based on other Stack Overflow discussions, exhaustive perusal of tutorials and Adobe documentation, and the comments on both (often the more useful resource), I'm coming to the conclusion that it's not possible.
You can use CameraRoll.browseForImage() to open the iOS gallery of photos and see all entities of MediaType.IMAGE, but it will not show you MediaType.VIDEO.
You can use CameraUI to launch the system camera by delegation, which returns a MediaPromise, but as far as I can tell it does not save the video you capture anywhere, and I cannot find a way to access the captured video using the MediaPromise (at least using the Loader class).
Here's my code as a hint in that direction. The second code block is using the CameraRoll to browseForImage() but there is no browseForVideo() in the API.
if (CameraUI.isSupported)
{
    camera = new CameraUI();
    camera.addEventListener(MediaEvent.COMPLETE, videoMediaEventComplete);
    camera.addEventListener(Event.CANCEL, cameraCanceled);
    camera.addEventListener(ErrorEvent.ERROR, cameraError);
    camera.launch(MediaType.VIDEO);
}
else
{
    statusText.text = "Camera not supported on this device.";
    startTimer();
}

if (CameraRoll.supportsBrowseForImage)
{
    roll = new CameraRoll();
    roll.addEventListener(MediaEvent.SELECT, cameraRollEventComplete);
    roll.addEventListener(Event.CANCEL, cameraCanceled);
    roll.addEventListener(ErrorEvent.ERROR, cameraError);
    roll.browseForImage();
}
else
{
    statusText.text = "Camera roll not supported on this device.";
    startTimer();
}
I've since found that videos captured using the delegated system camera are stored in a temporary storage location that iOS DOES allow access to. (I was pleasantly shocked.)
The captured video is not added to the device's Camera Roll like other videos captured using the iOS system camera app, so it's not enough to capture video and expect to be able to access it later (if, for instance, CameraRoll.browseForVideo() is ever added to the API).
Therefore, you have to 'get while the getting is good' and move the file from the temporary storage location to some non-volatile location, such as ApplicationStorageDirectory or the user's Documents directory (the only options in iOS, I think).
The MediaPromise, I think, is completely useless for accessing the video via any direct progressive loader/streamer method, but it still provides the location/url/path/filename of the temporary file, so you can perform File operations on it.
It's ironic that there are tutorials for getting around the lack of a file location/url/path/filename in the MediaPromise when using CameraRoll.browseForImage() (the trick is to use a loader class to load the image content, which you can then write out to a file), yet when taking video the content is not accessible and instead a file location/url/path/filename is provided. Also ironic that I could find nearly no resources to help with this. grumble
I'm going to include some code chunks without really editing them to strip out extraneous bits, because it's way past when I need to be in bed, but I wanted you to have this. I may come back and clean it up later.
This section is in a Spark SkinnablePopUpContainer, and I use the same click event for several buttons, so the 'case' below is from the switch-case in that event handler function.
In case you are not familiar, close(true, data) is the method to close the SkinnablePopUpContainer; it tells the parent/owner that the container was closed purposefully and that it should look for the data object being shared back (i.e., that there are changes to be committed).
case "cameraVideo":
{
if(CameraUI.isSupported)
{
camera = new CameraUI();
camera.addEventListener(MediaEvent.COMPLETE, videoMediaEventComplete);
camera.addEventListener(Event.CANCEL, cameraCanceled);
camera.addEventListener(ErrorEvent.ERROR, cameraError);
camera.launch(MediaType.VIDEO);
}
else
{
statusText.text = "Camera not supported on this device.";
startTimer();
}
break;
}
protected function cameraCanceled(event:Event):void
{
    statusText.text = "Camera access canceled by user.";
    startTimer();
}

protected function cameraError(event:ErrorEvent):void
{
    statusText.text = "There was an error while trying to use the camera.";
    startTimer();
}

protected function videoMediaEventComplete(event:MediaEvent):void
{
    statusText.text = "Preparing captured video...";
    camera.removeEventListener(MediaEvent.COMPLETE, videoMediaEventComplete);
    camera.removeEventListener(Event.CANCEL, cameraCanceled);
    camera.removeEventListener(ErrorEvent.ERROR, cameraError);
    var media:MediaPromise = event.data;
    data.MediaType = MediaType.VIDEO;
    data.MediaPromise = media;
    data.source = "camera video";
    close(true, data);
}
This section is the ActionScript in the close handler of the parent/owner of the SkinnablePopUpContainer (truncated once the useful code has been included):
private function choosePictureLightboxClosed(event:PopUpEvent):void
{
    imageButtonsActive = false;
    if (event.commit)
    {
        this.data = event.data as Object;
        filters = new Array();
        selection = true;
        switch (data.MediaType)
        {
            case MediaType.VIDEO:
            {
                mediaType = "video";
                trace(data.MediaPromise.file.url + " - " + data.MediaPromise.relativePath + " - " + data.MediaPromise.mediaType);
                var sourceFile:File = new File(data.MediaPromise.file.url);
                var destinationFile:File = File.applicationStorageDirectory.resolvePath("User" + parentApplication.userid);
                if (destinationFile.exists && !destinationFile.isDirectory)
                {
                    destinationFile.deleteFile();
                }
                destinationFile.createDirectory();
                destinationFile = destinationFile.resolvePath("Videos");
                if (destinationFile.exists && !destinationFile.isDirectory)
                {
                    destinationFile.deleteFile();
                }
                destinationFile.createDirectory();
                destinationFile = destinationFile.resolvePath(parentApplication.userid + "Video" + new Date().getTime() + ".mov");
                trace(destinationFile.nativePath);
                sourceFile.moveTo(destinationFile, true);
                break;
            }
I sure do hope this helps. This has been a very frustrating (and costly, in terms of our project being government grant funded and having deadlines we utterly failed to meet) experience, and I very much hope that these hard-won solutions might help others avoid the same.

Flash Builder will not read local JSON file

So I've tried to build a small utility to view the contents of a JSON file in an easy-to-understand manner (for non-technical people).
I have Googled far and wide, high and low, but every example that shows how to consume a JSON file in Flash Builder uses the HTTP service, pointing to a file on the web.
Here I am, sitting in front of my MacBook, wondering why I can't make this work. In the documentation I've found (sort of relating to this issue), they always show Windows examples, and those seem to work fine:
C://me/projects/json/my_json.json
Perhaps I'm completely missing the obvious, but is this possible on a Mac as well?
I've tried
file:///Users/me/projects/json/my_json.json
That doesn't work. I've tried some "resolve to path" syntax, but the HTTP service does not seem to allow for anything but file paths in quotes.
Would anyone be able to point me in the right direction?
Use the File API. It's really easy, here's a quick code sample:
// Get a File reference, starting on the desktop.
// If you have a specific file you want to open you could do this:
//   var file:File = File.desktopDirectory.resolvePath("myfile.json");
// and then skip directly to readFile().
var file:File = File.desktopDirectory;

// Add a listener for when the user selects a file
file.addEventListener(Event.SELECT, onSelect);

// Add a listener for when the user cancels selecting a file
file.addEventListener(Event.CANCEL, onCancel);

// This will restrict the file open dialog so that you
// can only open .json files
var filter:FileFilter = new FileFilter("JSON Files", "*.json");

// Open the file browse dialog
file.browseForOpen("Open a file", [filter]);

// Select event handler
private function onSelect(e:Event):void
{
    // Remove listeners on e.currentTarget
    // ...
    // Cast to File
    var selectedFile:File = e.currentTarget as File;
    readFile(selectedFile);
}

private function onCancel(e:Event):void
{
    // Remove listeners on e.currentTarget
    // ...
}

private function readFile(file:File):void
{
    // Read the file contents synchronously
    var fs:FileStream = new FileStream();
    fs.open(file, FileMode.READ);
    var contents:String = fs.readUTFBytes(file.size);
    fs.close();

    // Parse your JSON for display or whatever you need it for
    parseJSON(contents);
}
You hinted at this in your post about the examples being for Windows and you being on a Mac, but I'll state it explicitly here: you should always use the File API because it is cross-platform. This code will work equally well on Windows and on Mac.
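Since parseJSON() above is left undefined, here is a minimal sketch of what it might look like (the property name is hypothetical), using the native JSON class available in Flash Player 11+/AIR 3+:

private function parseJSON(contents:String):void
{
    var data:Object = JSON.parse(contents);
    // Do whatever the utility needs; here we just trace a field.
    trace(data.someProperty); // hypothetical property name
}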