I am working on a page where one can view the contents of one (or more) text files. I would also like a download button that lets the user download those files, and I would like to put the files in an archive before downloading.
What I would like to avoid is keeping a copy of both the files & the archive on my server (updating one but not the other, etc.).
How should I go about this?
Should I keep both the files & the archive anyway?
Do I create the archive on the fly (i.e. when the user hits the download button)?
Do I extract the contents of the archive each time the page is loaded?
Are there any libraries that you know of that already do this sort of thing? (I have searched but didn't come up with anything useful)
Is there a backend strategy that somehow prevents a discrepancy between files & archive?
....
Thanks
You could use PHP as the backend (some server-side language is needed to create zips on the fly) and create the zips based on user requests.
$files_to_zip = array(
    'preload-images/1.jpg',
    'preload-images/2.jpg',
    'preload-images/5.jpg',
    'kwicks/ringo.gif',
    'rod.jpg',
    'reddit.gif'
);

// if true, good; if false, zip creation failed
$result = create_zip($files_to_zip, 'my-archive.zip');
Further information: http://davidwalsh.name/create-zip-php
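For completeness, here is a minimal sketch of what that create_zip() helper can look like, adapted from the linked article and using PHP's built-in ZipArchive class; treat it as a starting point rather than the article's exact code:

function create_zip($files = array(), $destination = '', $overwrite = false) {
    // If the archive already exists and we may not overwrite it, stop.
    if (file_exists($destination) && !$overwrite) {
        return false;
    }
    $zip = new ZipArchive();
    $mode = $overwrite ? (ZipArchive::CREATE | ZipArchive::OVERWRITE) : ZipArchive::CREATE;
    if ($zip->open($destination, $mode) !== true) {
        return false;
    }
    // Add each file that actually exists, stored under its basename.
    foreach ($files as $file) {
        if (file_exists($file)) {
            $zip->addFile($file, basename($file));
        }
    }
    $zip->close();
    // Report success only if the archive was written.
    return file_exists($destination);
}

Since the archive is built per request, there is no stored copy to drift out of sync with the source files; you can unlink() it after sending, or stream it straight to the client.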
The problem
In PhpStorm I have a style.css file and an app.js file that I have to upload to a server over and over again. I'm trying to automate this.
They're generated/compiled by Webpack, which means that I can't simply use 'Tools' >> 'Deployment' >> 'Upload to...' (since those files aren't, and won't ever be, open in the editor).
What I currently do
At the moment, every time I want to see the changes I've made, I do this (for each file):
Navigate to the file in the file tree (using the mouse)
Select it
Then I use a shortcut I've set up for Main menu >> Tools >> Deployment >> Upload to..., after which I select the server I want to upload to.
I do this approximately 100+ times per day.
The ideal solution
The ideal solution would be that if I pressed a shortcut like CMD + Option + Shift + G, it then uploaded a selection of files (a scope?) to a predefined remote server.
Solution attempts
Open and upload.
Switching to those files (using CMD + P) and then uploading them (once they're open). But the files are generated, which means that it takes PhpStorm a couple of seconds to render the content (which is necessary before I can do anything with the file), so that's not faster.
Macro.
Recording a macro that uploads the two files.
If I go to the menu and trigger the Macro, then it works. So far so good.
But if I assign a shortcut key and trigger that shortcut while in a file, then it shows me a numbered list of servers to choose from.
And if I press '1' (for it to upload to number 1 on the list), then it uploads the file that I'm currently in(!?), and not the two files from my macro.
I've tried several different shortcuts (to rule out some kind of keyboard-shortcut-clash):
CMD + Option + CTRL + 0
CMD + Shift + 0
CMD + ;
... Same result.
And PhpStorm's macros don't seem to give me that many options anyway.
Keyboard Maestro.
I've tried doing it using Keyboard Maestro.
But I can't get it set up right. If it can't find the folders (because they're off-screen, or because I'm in a different project and forgot to adjust the shortcuts), then it blasts through the rest of the recorded actions, resulting in chaos. Ideally it should stop if it can't find the file on the screen.
Update 1 - External program
Even if it's not possible in PhpStorm, is there another program that I could achieve this with?
Update 2 - Automatic Deployment in PhpStorm
I've used this before, but a few times I've started syncing waaaay too many files, overwriting critical core files. It seems smart, but it can tear down walls if I've forgotten to define an ignore rule properly.
I wish there was an 'Automatic Deployment for these files' function.
Update 3 - File Watchers
I looked into File Watchers (a recommendation from @LazyOne). Based on this forum thread, file watchers cannot be used to upload files.
It is possible to accomplish this using the external program scp (secure copy):
Steps:
1. Create a Scope (for compiled files app.js and style.css)
2. Create a Custom File Watcher with scp over that Scope
Start with Scope:
Create a Local Scope named scp files for your compiled-files directory (I will assume that your Webpack build compiles into a dist directory):
Then, to add the dist directory to the Scope, select that folder and click Include Recursively. Apply, and move on to File Watchers.
Create a custom template for File Watcher:
Choose a Name
Choose File type as Any
Choose Scope as scp files (created earlier)
Choose Program as scp
Choose Arguments as $FileName$ REMOTE_USER@REMOTE_HOST:/REMOTE_DIR_PATH/$FileName$
Choose Working directory as $FileDir$
That's it. Basically, what we have done is: every time a file in that scope changes, that file is copied with scp to the corresponding path on the remote server.
Voila. Apply everything, recompile your project, and you will see that everything is uploaded to the server.
(I assume that you have already set up your SSH client, generated a public/private key pair, added the public key on your remote server, and know the SSH credentials to connect to it.)
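To make the variable expansion concrete: when the watcher fires for app.js, it effectively runs a command like the following, where deploy@example.com and /var/www/dist are hypothetical placeholders for your own user, host, and remote path:

scp app.js deploy@example.com:/var/www/dist/app.js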
I figured this out myself. I posted the answer here.
The two questions are kind of similar but not identical.
The way I found is also not the best, since it stores the server password in clear text. So I'll leave the question open, in case someone can come up with a better way to achieve this.
I opened my project on another computer, and the files where I'd been using a File Watcher were shown expanded. Before, they used to be nested, like home.scss is now after I've run the watcher once on that file.
Is there a way to automatically make all the files be nested?
Because when adding new files and folders with git, it would be quite troublesome to go into each and every file in order to make them nested.
For example, I have some minified JavaScript files that used to be nested but are now expanded for some reason.
Hope you understand. Thank you.
Edit: Nested***
Is there a way to automatically make all the files go under a caret like that?
Unfortunately not. Such nesting information (to "go under a caret", as you are saying) is taken from the "Output path to refresh" field of the corresponding File Watcher.
You have to run the File Watcher for such files at least once in order to see the files nested as you have them on your other computer.
Here is how you can run File Watchers manually, without the need to modify those files (so no extra history will appear in your git, or whatever VCS you may be using there):
https://stackoverflow.com/a/20012655/783119
P.S.
In PhpStorm 2016.3 (the next version, which will be released in 1.5-2 months or so) such nesting will be done automatically (for the most common combinations), so there will be no need to have File Watchers provide such info.
If you wish, you can try an EAP build right now (EAP means Early Access Program, which is, simply speaking, a sort of alpha/beta build; therefore some bugs in new functionality might be present and performance may not be optimal).
I'm exporting from an old Moodle release (1.9.2) and importing into 3.0.2.
Each module has dozens of videos that I play via a URL/link, which points to my own PHP program on the site (and that wraps a Camtasia video).
I found that in 3.0.2, the link opens on a separate page, unless I edit each link by going to "Appearance", then "Display" and setting it to "Embed".
So I would like to write a MySQL update script that automatically sets this flag for all such links (I will add a WHERE clause matching my script name).
I checked the database in phpMyAdmin and didn't see any likely table names.
You should try using the admin tools to update everything. Go to your Moodle installation's main URL.
Then go to the site administration. After /admin in the URL, add /tool/replace and go there.
There you can enter what you want to find in the DB and replace it with another value. Just be careful with this tool, and make a backup before you begin.
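If you do end up scripting it instead of using the replace tool, the URL resources live in the url table (mdl_url with the default prefix), whose display column holds the display mode; 1 corresponds to RESOURCELIB_DISPLAY_EMBED in lib/resourcelib.php. A minimal CLI sketch, where the myplayer.php filter is a hypothetical stand-in for your own script name (verify the table and column against your database, and back up first):

<?php
// One-off CLI script; run from the Moodle root directory.
define('CLI_SCRIPT', true);
require(__DIR__ . '/config.php');

// 1 = RESOURCELIB_DISPLAY_EMBED (see lib/resourcelib.php).
$DB->execute(
    "UPDATE {url} SET display = 1 WHERE externalurl LIKE ?",
    ['%myplayer.php%'] // hypothetical filter; adjust to match your links
);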
I'm considering using Google Drive push notifications in order to replace our current polling process.
I started playing with it, but I have 2 major problems:
Watching changes:
When watching for Drive changes, I get a notification with the new change ID. But when I try to query it using driveService.changes().get(changeId), I intermittently get a 404. Am I doing something wrong here?
Watching files:
When watching for file changes on a folder, I want to know about new files added to that folder, so I expected that when adding/removing files from this folder, the "x-goog-resource-state" header would hold an "add"/"remove" value while "x-goog-changed" would contain "children".
In reality, "x-goog-changed" does contain "children", but "x-goog-resource-state" is always "update", and there is no extra information about the added/deleted file.
Regarding deleted files, I know I can detect that by watching the file once I have it, but is there a way to get updates about new files in a certain folder?
I was working on a similar project a few months ago. There are two things you can do to monitor changes on Google Drive:
Set up push notifications using changes().watch()
Set up push notifications using files().watch()
The 1st case sends you a request for everything that happens on the Drive you are monitoring, with very little information about what exactly has changed.
The 2nd case is less 'spammy', and you get to decide which folder to monitor.
However, the tags on the change type are not accurate. When I was using files().watch(), I tested all the use cases and compared the headers of each case.
My conclusions are:
For a new file (or folder) created inside yourfolder (yourfolder/newfile), the headers contain:
'X-Goog-Changed': 'properties'
'X-Goog-Resource-State': 'update'
This is the same as when you move a file into yourfolder, or when you start following an existing file in your folder.
You get 'X-Goog-Resource-State': 'add' when you share the file with a user.
As you can see, the header tags are not accurate/unique.
Also, note that the push-notification channel will not send you requests for files inside a folder inside yourfolder (yourfolder/folder/files). And the channel will expire at some point.
If you still have any questions, or want to know how to implement the code, let me know : )
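For reference, here is a rough sketch of registering a files().watch() channel with the google-api-php-client (the snippets above look like the Java client, so treat this purely as an illustration; the credentials file, channel ID, callback address, and FOLDER_ID are all placeholder assumptions):

<?php
require_once 'vendor/autoload.php';

$client = new Google_Client();
$client->setAuthConfig('credentials.json'); // hypothetical credentials file
$client->addScope(Google_Service_Drive::DRIVE_READONLY);

$service = new Google_Service_Drive($client);

// The channel tells Google where to POST the notifications.
$channel = new Google_Service_Drive_Channel();
$channel->setId(uniqid('drive-watch-'));                   // any unique id
$channel->setType('web_hook');
$channel->setAddress('https://example.com/drive-notify');  // must be HTTPS

// Watch the folder; Google will send notifications (with the X-Goog-*
// headers discussed above) to the address until the channel expires.
$service->files->watch('FOLDER_ID', $channel);

Remember that channels expire, as noted above, so you will need to re-register them periodically.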
I don't know if this is possible, but I have an SWF file which I want to get info from an XML file only one time and then store it (keep it) until newer data is pushed into it.
I know it sounds stupid, but maybe there is a solution I don't know of...
1) I don't want to load the XML every time because of the amount of traffic we have (a lot... it will cost a lot to refresh every time from Amazon S3),
but how do we get newer data without checking the external XML file?
2) If there were a way to broadcast (to "ping") to the SWF that an update of the XML is ready to load....
If anything, I believe there should be an AS3 script for that.
thanks!
but how do we get newer data without checking the external XML file?
You could implement data file versioning. For example: urlToXmlData + "?version=" + myCurrentVersion. As soon as you ship a new version of the application with extended data, it will work. It's a global solution, so every new version of your product will work with the latest data.
As for the ping, you could create an update strategy, for example: the link is valid for one day. The idea is the same: concatenate something to the link so the browser will evaluate it as new: urlToXmlData + "?stamp=" + timestampForToday. Tomorrow there will be another timestamp, and the browser will download the updated version.
Use a second XML file which will contain the version of the first XML.
I think you're missing the real issue.
You can't update a Flash project without using some external update platform.
You cannot save all the new XML data to a shared object, because of the size limit on shared-object storage, so Flash is going to drop all the loaded data when the application is closed.
You cannot make a "permanent update" in Flash for "big files"...
BUT
You can update your main SWF file with "EMBEDDED XML" files. Only that keeps your data updated when the application is closed.
BUT that's not enough alone; you also need a "pre-check for the SWF version" step, and that's what you can't do alone. You need a web platform that provides that.
Mochi's "Live Updates" offered that, but I couldn't make it work. And Mochi is shut down now... so forget about it...
I understand that your XML is BIG and you want to save traffic.
But how about small calls?
You can make a call just to get the MD5 of your XML, which is stored on the server.
Something like: myserver.com/getMD5
Once the server returns a different MD5 than the one you have already stored, you reload the XML and save the new MD5.
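Server-side, that endpoint can be tiny. A minimal sketch in PHP, assuming the data sits next to the script as data.xml (both file names are hypothetical):

<?php
// getMD5.php - return the MD5 of the XML currently on the server.
header('Content-Type: text/plain');
echo md5_file(__DIR__ . '/data.xml');

The SWF can keep the last-seen hash in a small SharedObject (a 32-character hash fits easily within the storage limits discussed above) and download the full XML only when the returned value differs.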