AIR/AS3: Upload portion of a File without creating a new File chunk - actionscript-3

I'm currently building an AIR file uploader designed to handle multiple and very large files. I've played around with different methods of breaking a file up into chunks (100 MB) and progressively uploading each one so that I can guard against a failed upload, disconnection, etc.
I have managed to break the file up into smaller files, which I then write to a scratch area on the disk. However, I'm finding that the actual process of writing the files is quite slow and chews up a lot of processing power; my UI basically grinds to a halt while it's writing. Not to mention that I'm effectively doubling the local disk space taken up by every file.
The other method I used was to read into the original file in 100 MB chunks and store that data in a ByteArray, which I can then upload as POST data using the URLLoader class. The problem is that this way I can't keep track of the upload progress, because ProgressEvent.PROGRESS does not work properly for POST requests.
What I would like to know is whether it's possible to read into the file in 100 MB chunks and upload that data without having to create a new file, while still using the FileReference.upload() method so I can listen for all the events that method gives me. I.e. create a File() that is made up of bytes 0-100 MB of the original file, then call upload() on that new File.
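For illustration, a minimal sketch of that second approach might look like the following; the endpoint URL and the chunk query parameter are placeholders, not part of any real API:

import flash.filesystem.File;
import flash.filesystem.FileMode;
import flash.filesystem.FileStream;
import flash.net.URLLoader;
import flash.net.URLRequest;
import flash.net.URLRequestMethod;
import flash.utils.ByteArray;

const CHUNK_SIZE:uint = 100 * 1024 * 1024; // 100 MB

// Read one 100 MB slice of the original file and POST it, without
// writing a temporary file to disk.
function uploadChunk(file:File, chunkIndex:uint):void {
    var stream:FileStream = new FileStream();
    stream.open(file, FileMode.READ);
    stream.position = Number(chunkIndex) * CHUNK_SIZE;

    var chunk:ByteArray = new ByteArray();
    stream.readBytes(chunk, 0, Math.min(CHUNK_SIZE, stream.bytesAvailable));
    stream.close();

    var request:URLRequest = new URLRequest("http://example.com/upload?chunk=" + chunkIndex); // placeholder URL
    request.method = URLRequestMethod.POST;
    request.contentType = "application/octet-stream";
    request.data = chunk;

    var loader:URLLoader = new URLLoader();
    // The caveat from above: ProgressEvent.PROGRESS on URLLoader tracks the
    // response download, not the POST body, so upload progress is unavailable.
    loader.load(request);
}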
I can post my full code for both methods if that helps.
Cheers, much appreciated

I had the same problem, but we solved it another way: we wrote a socket connector that connects to the server (e.g. FTP/HTTP) and writes the ByteArray down the socket. We also did it in chunks of around the same size, and the biggest file we had to upload was a Blu-ray movie of around ~150 GB.
So I hope you got some interesting ideas from my message. If you'd like, I can share some pieces of code with you.
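For illustration only, the core of such a socket uploader might look like this. The host, port and file name are made up, a real server would also need some framing protocol to know where chunks start and end, and it assumes a runtime recent enough that Socket dispatches OutputProgressEvent (AIR 3+):

import flash.events.Event;
import flash.events.OutputProgressEvent;
import flash.filesystem.File;
import flash.filesystem.FileMode;
import flash.filesystem.FileStream;
import flash.net.Socket;
import flash.utils.ByteArray;

const CHUNK_SIZE:uint = 100 * 1024 * 1024; // 100 MB per write

var file:File = File.documentsDirectory.resolvePath("movie.m2ts"); // example file
var stream:FileStream = new FileStream();
stream.open(file, FileMode.READ);

var socket:Socket = new Socket();
socket.addEventListener(Event.CONNECT, onConnect);
socket.addEventListener(OutputProgressEvent.OUTPUT_PROGRESS, onOutputProgress);
socket.connect("uploads.example.com", 8080); // assumed host/port

function onConnect(e:Event):void {
    sendNextChunk();
}

function sendNextChunk():void {
    if (stream.bytesAvailable == 0) {
        stream.close();
        socket.close();
        return;
    }
    var chunk:ByteArray = new ByteArray();
    stream.readBytes(chunk, 0, Math.min(CHUNK_SIZE, stream.bytesAvailable));
    socket.writeBytes(chunk, 0, chunk.length);
    socket.flush();
}

function onOutputProgress(e:OutputProgressEvent):void {
    // bytesPending == 0 means the previous chunk has left the socket
    // buffer, so it's safe to queue the next one.
    if (e.bytesPending == 0) {
        sendNextChunk();
    }
}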

Related

Firebase: Exporting JSON Unable to export The size of data exported at a single location cannot exceed 256 MB

I used to download a node of the Firebase Realtime Database every day to monitor some outputs by exporting the JSON file for that node. The JSON file itself is about 8 MB.
Recently, I started receiving an error:
"Exporting JSON Unable to export The size of data exported at a single location cannot exceed 256 MB. Navigate to a smaller part of the database or use backups. Read more about limits"
Can someone please explain why I keep getting this error, since the JSON file I exported just yesterday was only 8.1 MB?
I probably solved it! I disabled a CORS add-on in Chrome and suddenly the export worked. :)
To get around this, you can use Postman's Import feature, because downloading a large JSON file from the Firebase dashboard in a browser sometimes fails partway through. You can paste the traditional cURL commands into it; you just need to click "save the response" once the response arrives. To avoid the authentication complexity, you can set the Firebase database rule permission to read: true until the download is complete, though you need to keep security in mind while doing this. Postman also sometimes freezes its UI trying to preview the JSON, but you don't need to be bothered by that.

Creating a large PDF file with Adobe AIR

I'm trying to create a PDF file using the AlivePDF lib with about 4000 pages, where each page contains an image added by the AlivePDF method addImage(). The problem is that the data takes a lot of memory, because AlivePDF creates the whole file in memory before saving it to disk. So I am asking if there is a way I can open the file for each entry, add a page, then close it and free the memory before adding the next page.
Thanks
I suggest you first use your app to make many much smaller PDFs (for example, 400 pages each: the first 400 pages in one PDF, the next 400 in a new PDF, and so on until all 4000 pages are done across 10 PDFs).
When those are ready, use an external tool like PDFtk to process the files you saved to disk. It works from the command prompt, but you can have AIR run it as a NativeProcess (see the sketch below). It accepts instructions to combine those saved files and output them as one big new file.
Don't worry about memory: nothing is loaded into Flash. Flash itself only starts the process; the merging happens outside your app. Just wait for the big file to magically appear.
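A rough sketch of that NativeProcess call, assuming pdftk is installed at the given path and the intermediate files are named part1.pdf through part10.pdf (both assumptions); note that NativeProcess requires the extendedDesktop profile in your app descriptor:

import flash.desktop.NativeProcess;
import flash.desktop.NativeProcessStartupInfo;
import flash.events.NativeProcessExitEvent;
import flash.filesystem.File;

// assumed location of the pdftk binary; adjust per platform
var pdftk:File = new File("/usr/bin/pdftk");

// equivalent to: pdftk part1.pdf ... part10.pdf cat output combined.pdf
var args:Vector.<String> = new Vector.<String>();
for (var i:int = 1; i <= 10; i++) {
    args.push(File.documentsDirectory.resolvePath("part" + i + ".pdf").nativePath);
}
args.push("cat", "output");
args.push(File.documentsDirectory.resolvePath("combined.pdf").nativePath);

var info:NativeProcessStartupInfo = new NativeProcessStartupInfo();
info.executable = pdftk;
info.arguments = args;

var process:NativeProcess = new NativeProcess();
process.addEventListener(NativeProcessExitEvent.EXIT, onExit);
process.start(info);

function onExit(e:NativeProcessExitEvent):void {
    trace("pdftk finished with exit code " + e.exitCode);
}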

ActionScript: is there a way to upload a file straight into memory?

I'm looking into uploading an XML file and then storing its contents in a database. It looks like the upload method of flash.net.FileReference would do the job; however, it only gives you the option of uploading it to a server.
I could upload it to a server, read it back from that server and then delete the file, but I would like to avoid the extra work.
Is there a way to just load a file into memory without saving it in some remote location?
No, this cannot be done; uploads can only go to a server, probably for security reasons.
If you need to store the content in a database anyway, why don't you have the server-side backend handle it?
If this is just some data that you need and can throw away after the program is complete, perhaps you could consider asking the user to copy and paste their data into a text field. That might depend on your target audience, though: IT types, no problem; non-IT types, problem :D
If you are trying to have the user select an XML file from their local machine: after your myFileReference.load(), in your Event.COMPLETE handler function you can use var myXML:XML = XML(myFileReference.data); to get the data of the file you selected.
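Put together, that approach looks roughly like this (Flash Player 10+/AIR, since that's when FileReference.load() was introduced; names are illustrative):

import flash.events.Event;
import flash.net.FileFilter;
import flash.net.FileReference;

var fileRef:FileReference = new FileReference();
fileRef.addEventListener(Event.SELECT, onSelect);
fileRef.addEventListener(Event.COMPLETE, onComplete);
fileRef.browse([new FileFilter("XML files", "*.xml")]);

function onSelect(e:Event):void {
    fileRef.load(); // reads the file into memory; no server round-trip
}

function onComplete(e:Event):void {
    var myXML:XML = XML(fileRef.data); // fileRef.data is a ByteArray
    trace(myXML.toXMLString());
}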
Yes, you can load all the content into memory: just push it into an array, and whenever you want it, just read it back out.

Saving several images (and metadata for each) in a single file using Adobe Air

Is it possible, via Adobe AIR, to save multiple types of data in a single file? For example, an application would allow the user to load in external images, position them on stage and label them. This data would then be stored in a ByteArray (I guess), using BitmapData for the images and probably XML for the metadata.
I would then like to write this to a single file, with a bespoke file extension that could be associated with said AIR app.
I've asked this on various forums and never received a single reply.
You can add everything to a byte array and write it to a file, but defining boundaries and extracting the individual entities back out of the file takes some effort (see the sketch below). How about writing them to normal files, zipping those into a single file and deleting the originals? That way you still have a single file and can deal with the individual items more easily.
This article describes some ActionScript zip libraries. I've used nochump in the past and it was easy; this page has some sample code.
If you want some individuality for your files, you can rename the zipped file to whatever extension you want; that's what Firefox extensions do: they have the .xpi extension, but they're plain zip files renamed.
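If you do go the raw byte-array route mentioned above, one way to define the boundaries is a simple length-prefixed layout. This is only a sketch: the format, the function and the use of the Flex SDK's PNGEncoder are all assumptions for illustration:

import flash.display.BitmapData;
import flash.filesystem.File;
import flash.filesystem.FileMode;
import flash.filesystem.FileStream;
import flash.utils.ByteArray;
import mx.graphics.codec.PNGEncoder; // assumes the Flex SDK's PNG encoder

// Layout: [uint xmlLength][xml bytes][uint imageCount] then, per image,
// [uint pngLength][png bytes]. Reading the file back walks the same layout.
function saveProject(file:File, metadata:XML, images:Vector.<BitmapData>):void {
    var out:FileStream = new FileStream();
    out.open(file, FileMode.WRITE);

    var xmlBytes:ByteArray = new ByteArray();
    xmlBytes.writeUTFBytes(metadata.toXMLString());
    out.writeUnsignedInt(xmlBytes.length);
    out.writeBytes(xmlBytes);

    out.writeUnsignedInt(images.length);
    var encoder:PNGEncoder = new PNGEncoder();
    for each (var bmp:BitmapData in images) {
        var png:ByteArray = encoder.encode(bmp);
        out.writeUnsignedInt(png.length);
        out.writeBytes(png);
    }
    out.close();
}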

How can I add file locations to a database after they are uploaded using a Perl CGI script?

I have a CGI program I have written using Perl. One of its functions is to upload pics to the server.
All of it is working well, including adding all kinds of info to a MySQL db. My question is: how can I get the uploaded pic files' locations and names added to the db?
I would rather do that than change the script to actually upload the pics into the db. I have heard horror stories about uploading binary files to databases.
Since I am new to all of this, I am at a loss. I have tried doing some research and web searches for 3 weeks now with no luck. Any suggestions or answers would be greatly appreciated. I would really hate to have to manually add all the locations/names to the db.
I am using: a Perl CGI script, MySQL db, Linux server and the files are being uploaded to the server. I AM NOT looking to add the actual files to the db. Just their location(s).
It sounds like your method is complete up to the point where you take the upload, turn it into a string and toss it into MySQL, similar to reading a file in as a string. However, since CGI gives you a filehandle rather than a filename to read, you are wondering where that file actually is.
If you're using CGI.pm, then upload(), uploadInfo(), the upload's param() value and the private temporary files will help you deal with the uploaded file's source. Where the files are stashed after the remote client and the CGI script are done is usually not permanent, and at minimum is volatile.
You've got a bunch of already-uploaded files that need to be added to the db? It should be trivial to dash off a one-off script that loops through all the files and inserts the details into the DB. If they're all in one spot, then a simple opendir()/readdir() loop would catch them all; otherwise you can make a list of file paths and loop over that.
If you're talking about recording new uploads on the server, then it would be something along these lines (sketched in code after the list):
user uploads file to server
script extracts any wanted/needed info from the file (name, size, mime-type, checksums, etc...)
start database transaction
insert file info into database
retrieve ID of new record
move uploaded file to final resting place, using the ID as its filename
if everything goes fine, commit the transaction
Using the ID as the filename solves the worries of filename collisions and new uploads overwriting previous ones. And if you store the uploads somewhere outside of the site's webroot, then the only access to the files will be via your scripts, providing you with complete control over downloads.
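A hedged sketch of that flow using CGI.pm and DBI; the table name, columns, DSN and upload directory are all assumptions for illustration:

use strict;
use warnings;
use CGI;
use DBI;

my $q    = CGI->new;
my $fh   = $q->upload('pic') or die "no file uploaded";
my $name = scalar $q->param('pic');                    # client-side file name
my $type = $q->uploadInfo($fh)->{'Content-Type'};      # mime type from the upload headers
my $size = -s $fh;

my $dbh = DBI->connect('DBI:mysql:database=mydb', 'user', 'pass',
                       { RaiseError => 1, AutoCommit => 0 });

# insert the metadata first, inside a transaction
$dbh->do('INSERT INTO pics (orig_name, mime_type, size) VALUES (?, ?, ?)',
         undef, $name, $type, $size);
my $id = $dbh->last_insert_id(undef, undef, 'pics', 'id');

# copy the upload to its final resting place, named after the new ID,
# somewhere outside the webroot
my $dest = "/var/uploads/$id";
open my $out, '>', $dest or do { $dbh->rollback; die "open $dest: $!" };
binmode $fh;
binmode $out;
my $buf;
print {$out} $buf while read($fh, $buf, 8192);
close $out or do { $dbh->rollback; die "close $dest: $!" };

# record the final location, then commit
$dbh->do('UPDATE pics SET path = ? WHERE id = ?', undef, $dest, $id);
$dbh->commit;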