How to copy the contents of an FTP directory to a shared network path? - ssis

I have the need to copy the entire contents of a directory on an FTP location onto a shared network location. The FTP Task has you specify an exact file name (not a directory), and the File System Task does not allow accessing an FTP location.
EDIT: I ended up writing a script task.

Nothing like reviving a really old thread... but there is a solution to this.
To copy all the files from a directory, specify your remote path as /[directory name]/*
Or, for just files and not directories: /[directory name]/.
Or for specific file types: /[directory name]/*.csv

I've had some similar issues with the FTP task before. In my case, the file names changed based on the date and some other criteria. I ended up using a Script Task to perform the FTP operation.
It looks like this is what you ended up doing as well. I'd be curious if anyone else can come up with a better way to use the FTP task. It's nice to have...but VERY limited.
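For reference, a Script Task version of a whole-directory download can lean on the FtpClientConnection class from Microsoft.SqlServer.Dts.Runtime. The sketch below is illustrative rather than exact; "FTP Server" is a placeholder for your FTP connection manager's name, and the remote directory and share path are made up:
// grab the FTP connection manager defined in the package
FtpClientConnection ftp = new FtpClientConnection(Dts.Connections["FTP Server"].AcquireConnection(null));
ftp.Connect();
ftp.SetWorkingDirectory("/remotedir");
// list everything in the working directory
string[] folderNames;
string[] fileNames;
ftp.GetListing(out folderNames, out fileNames);
// download all listed files to the network share, overwriting existing copies (ASCII = false)
ftp.ReceiveFiles(fileNames, @"\\myserver\share", true, false);
ftp.Close();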

When I need to do this sort of thing I use a batch file to call FTP on the command line and use the mget command. Then I call the batch from the DTS/DTSX package.
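If it helps anyone, a bare-bones version of that batch approach could look like this; the host, credentials, and paths are all placeholders. The -n switch stops ftp.exe from auto-logging in (so the user command works), -s feeds it the command script, and prompt switches off the per-file confirmation that mget would otherwise ask for:
rem build the FTP command script, run it, then copy the results to the share
echo open ftp.example.com> ftpcmds.txt
echo user myuser mypassword>> ftpcmds.txt
echo cd /remotedir>> ftpcmds.txt
echo lcd D:\staging>> ftpcmds.txt
echo prompt>> ftpcmds.txt
echo mget *.*>> ftpcmds.txt
echo bye>> ftpcmds.txt
ftp -n -s:ftpcmds.txt
xcopy D:\staging\*.* \\myserver\share\ /Y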

SSIS 2012 File Paths

I am using SSIS 2012 to perform a couple of basic tasks. The issue I am having occurs in both an Execute Process task and a Flat File connection. During development the file is local, but I am using an expression to replace the file directory once deployed. I have the files, which are a CSV and a BAT file, in the "Miscellaneous" folder.
I would like to be able to reference the relative path of the files rather than an explicit directory on my computer. This would also prevent other developers from having to stage the files locally before being able to even validate the package.
Try using a Project Parameter as your folder path and use it in your expression, something like:
@[$Project::FileLocation] + @[User::FileName]

How can I stop "jekyll build" from overwriting existing files in the output directory?

The source for my Jekyll-powered website lives in a git repo, but the website also needs a couple of static files that are too large to go under version control. Thus, they are not part of the Jekyll build pipeline.
I would like for these to simply live in an assets directory in the Jekyll destination (which is a server directory; note that I don't have any control over the server here; all I can do is dump static files into a designated directory) that does not exist in the git repo. But running jekyll build deletes everything in the output directory.
Is there a way to change Jekyll's behavior in this case? Or is there some other good way to handle this issue?
Not sure this addresses the specific case in the OP, but seeing as how I kept landing on this page before I finally found an answer here, I thought I'd add an answer to this question in case it helps others.
I have a git post-receive hook that builds my Jekyll site on my web host when I push to it, but the build was also deleting anything else that I had FTP'ed over. So now I've put everything I need to stick around in a directory (external/ in my case), and added the following to my _config.yml:
exclude: [external]
keep_files: [external]
and now files in external/ survive.
If you upload Jekyll's output directory via FTP to your server, you can use an FTP tool that lets you ignore folders.
For example, my own site is built with Jekyll, but hosted on my own webspace, so I'm uploading it via FTP.
I explained in this answer how I scripted the building and uploading process, so I can update my site with a single click.
In my case (Windows), I used WinSCP, a free command-line FTP client, for this.
If you're not on Windows, you need to use something else, but there are probably other FTP tools out there that are able to ignore folders.
To ignore your assets folder in WinSCP, you just need to put this line into the script file:
(the file which contains the actual WinSCP commands - read my other answer for more information)
option exclude "assets/"
Now you can upload your large assets folder on the server once, and it won't be overwritten/deleted when you later update your site via FTP.
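For context, a complete script file built around that line might look something like this; the server, credentials, and paths are placeholders, and synchronize remote pushes local changes up to the server:
option batch abort
option confirm off
option exclude "assets/"
open ftp://myuser:mypassword@ftp.example.com/
synchronize remote "C:\mysite\_site" "/httpdocs"
close
exit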

Writing/Reading User-Defined settings in app.config file

I am trying to read and write user settings in the app.config file. I found a snippet of code for working with the config file. I finally got it compiling and running, but it seems to be doing absolutely nothing to the App.config file.
Here is the code:
method MainForm1.Button1.Click(sender: System.Object; e: System.EventArgs);
var
  config: System.Configuration.Configuration;
begin
  // open the configuration file of the current executable
  config := ConfigurationManager.OpenExeConfiguration(ConfigurationUserLevel.None);
  // add a new key/value pair to <appSettings> and write it back to disk
  config.AppSettings.Settings.Add('PreferenceToRemember', 'value1');
  config.Save(ConfigurationSaveMode.Modified);
  // force the appSettings section to be re-read on next access
  ConfigurationManager.RefreshSection('appSettings');
end;
It is compiling without any errors, but I don't know if it is doing anything.
Is there anything wrong with the code? I need to be able to write/read a section and write/read a key/value. Thanks in advance.
UPDATE: Instead of using ConfigurationManager, I am simply using Properties.Settings.Default. However, I am having a bit of a problem writing to it and reading back from it, although the program compiles without any errors and the code seems simple.
How do you read and write to Properties.Settings.Default from within your code?
Maybe you're looking at the wrong file?
The app.config you have in your solution will be copied to YourProgramFile.exe.config in the bin/Debug or bin/Release folder. When running your program it will update this file, not the app.config file in your solution.
Then perhaps you should also check the write permissions on your application folder. Normally (Win Vista, Win 7) the user executing an application does not have write permissions in the Program Files folder where your application resides, so updating the .config will most probably fail due to the lack of write permissions. This is even more true on Linux/Unix systems.
You should try to separate out the elements you need to write and keep them in an additional config file in a user-specific folder. You can seed it initially with the defaults from the normal application config and then update only the user-specific file; this way you are not hindered by file permissions and every user can update their settings individually.
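As for the UPDATE in the question: assuming a user-scoped setting named PreferenceToRemember has been defined on the project's Settings page, reading and writing Properties.Settings.Default in Oxygene should look roughly like this (an untested sketch; note that user-scoped settings are persisted to a per-user user.config file, not to the .exe.config):
// read the current value of the setting
var current := Properties.Settings.Default.PreferenceToRemember;
// change it and persist it for the current user
Properties.Settings.Default.PreferenceToRemember := 'value1';
Properties.Settings.Default.Save();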

SSIS flat file location. Local drive vs shared folder on same server

I have a flat file stored locally on the same server where SSIS is running.
When choosing the location of my flat file in the Flat File connection manager, I could use the local drive (d:\testfiles\flatfile.txt) or I could use the UNC path (\\myserver\flatfileshare\flatfile.txt).
Both point at the same file and the package is successful either way. Is there a performance reason why I should choose one over the other?
More than a performance reason, the UNC path gives you a more flexible solution: if you later change the SSIS package so that it runs on another server, the path to the file will still be correct.
Specify your files like this:
\\server\sharename\path\file.txt
This will work in both places.
Referred to as a UNC path.
When I had XP, I loved this utility for generating them: clippath. You could right-click a file and it would copy the path to your clipboard. Magical.
Now I'm on Win7 x64 and it's not supported. Windows 7 has a "copy as path" option, but it seems to use the drive letter, which is not what we want, is it?
Looking up a file is a trivial operation; I wouldn't worry about the difference.
I'd use some sort of package configuration to store the path to the file; it's a much more flexible solution.

Correct PHP file upload permissions

I have developed a download/upload manager script.
When I upload a file via the POST method it is stored in a folder called files; the files folder is inside another folder called download-manager.
Now it seems that when I upload via the POST method, CHMOD 0666 on the file is enough to rename or delete it, but the download-manager folder and the files folder need to be CHMOD 0777 for this to work. Can someone tell me if this is dangerous?
1) I have a "deny all" in .htaccess so nobody can access the files directory via a browser.
2) The upload script is protected by a username and password, which the person who uses the script will obviously change, so basically only admins can upload, rename, edit, and delete files and the records in the MySQL database.
When a file is uploaded, a record is added to the database with information like the file type, file name, and file size, and the unique id (auto-incremented by MySQL) is appended to the process.php URL. process.php checks that the record and file exist and, if so, forces the download of that file; the directory and mime type are never revealed.
Basically the download URL is like www.mydomain.com/process.php?file=57; a check is done to make sure that id exists in the database and that a file exists with the file name stored in the database for that id.
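For what it's worth, the logic in process.php boils down to something like this sketch; the table and column names here are made up for illustration, and the real script does more validation:
<?php
// process.php sketch: look up the record for ?file=NN and stream the file
$pdo = new PDO('mysql:host=localhost;dbname=downloads', 'dbuser', 'dbpass'); // placeholder credentials
$id = isset($_GET['file']) ? (int) $_GET['file'] : 0;
$stmt = $pdo->prepare('SELECT file_name, mime_type FROM files WHERE id = ?');
$stmt->execute(array($id));
$row = $stmt->fetch(PDO::FETCH_ASSOC);
$path = '/path/to/files/' . ($row ? $row['file_name'] : '');
if ($row && is_file($path)) {
    // record and file both exist: force the download without revealing the directory
    header('Content-Type: ' . $row['mime_type']);
    header('Content-Disposition: attachment; filename="' . $row['file_name'] . '"');
    header('Content-Length: ' . filesize($path));
    readfile($path);
} else {
    header('HTTP/1.0 404 Not Found');
}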
Now all this works fine when uploading the file via a form using the POST method, but I also added a manual upload: people who want to upload a file larger than their web host allows can upload it via an FTP program, for example, and then add the file name and file details themselves via a form in the admin area to link the record with the file. The problem is then a permission issue: if the file is uploaded via FTP (or any other way), the PHP script cannot rename or delete the file later, because the script does not have the correct privileges. So from what I gather, the only option is to tell the people who use the script to change the file's CHMOD to 0777 for it to work; I think that will make it work?
But then I have the problem of 0777 also being executable. The script allows any file type to be uploaded, since it's a download/upload manager, but at the same time I am slightly confused by all this permissions lark and what I should actually be doing. As PHP is limited by the max upload size set by the host, I want manual upload so users can get the file up by another method and assign it to a database record, but then, as stated, I have a problem when the PHP script needs to rename or delete the file.
I have built the script to detect such problems and notify the user, but I would like this script to do all the leg work, or nearly all of it, without having to state in the manual that the admin will have to CHMOD the file to 0777 whenever they want the script to rename or delete it. Although I don't know if just chmodding the file to 0777 will actually let the PHP script rename and delete it, and security is then a concern too.
UPDATED
OK, thanks. So chown the file before chmodding it on upload?
Do I just use chown() on the file and nothing else, and that will make it owned by the server process and make it private? I see you have:
chown apache:apache '/path/to/files' ;
Do I need to add the apache:apache bit?
I did think of a simpler solution: if an admin does a manual upload, tell them they will have to rename/delete the file manually if needed in the future, because the script won't have the correct permissions to do so. The manual-upload script can just update the db record to keep it linked to the file, so there are no file permission worries.
Simply put, the user renames the file manually, via FTP for example, from myfile.zip to somefile.zip, then edits the db record for that file and changes the file name from the old myfile.zip to somefile.zip; everything stays linked, with no permission issues to worry about. I have also been reading that chown() does not always work, or cannot be relied on, for whatever reason.
1) I have a "deny all" in .htaccess so nobody can access the files directory via a browser
Store your files in a separate folder, away from the directory structure that houses your PHP files.
As far as the permissions on the directory are concerned, there are three ways to go about setting up the permissions on the folder:
Make it world-writable (chmod 0777 '/path/to/files/')
This is not recommended, as it has major security implications, especially on a non-dedicated server; anyone who has an account on the server, or who can tell a process on it to write to or delete from that folder, will be able to change its contents.
Make it temporary (chmod 1777 '/path/to/files/')
This also carries a security concern, but less so than option 1, for the following reason: with the sticky bit set, users can delete or rename only the files they own, not the rest of the directory's contents.
Make it owned by the server process and make it private (chown apache:apache '/path/to/files' ; chmod 0700 '/path/to/files')
This is arguably the best solution.
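To tie that back to the PHP side, an upload handler under option 3 might look like this sketch; the form field name and target path are placeholders, and since downloads are streamed through process.php, nothing but the server process needs direct access to the file:
<?php
// move the POSTed file into the private files folder
$target = '/path/to/files/' . basename($_FILES['upload']['name']);
if (move_uploaded_file($_FILES['upload']['tmp_name'], $target)) {
    // the file is created owned by the web server user; keep it private
    chmod($target, 0600);
}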
Just relax & enjoy.
On many shared hostings it's the only possible solution anyway.
There is another option: ask the user for their FTP password and use FTP to copy the files from tmp, like WordPress does. But I think it's even less secure.