I am creating a script on the fly to ftp some files from a remote computer. I create a file which is then called from the command line with
ftp -s:filename proxy
where filename is the file I just created. The file has code similar to the following:
anonymous#ip address
username
prompt off
binary
cd c:\destination directory
mget c:\source directory\*.*
quit
That doesn't work. Neither does the following:
anonymous#ip address
username
prompt off
binary
cd c:\source directory
mput c:\destination directory
quit
Obviously, I'm not so good at FTP. How, in what order, and where in my file do I specify the place where I want the files to be put (the destination directory, which is also where the ftp process is running), and where I want the files to come from (the IP address of the computer that has the files I want)? Do I need to set the directory before starting the ftp process?
I'm running this in an SSIS package, and I'm not using the SSIS ftp task, because I don't want a failure if no files are found. If there's nothing there, that's cool. If there is something there, I want a copy.
(It was working in my development area, and now, when I'm trying to get files from a server that I truly have no access to except via FTP, I'm not getting anything. See How to avoid SSIS FTP task from failing when there are no files to download? for an earlier, related question.)
Update:
Both of the answers below, listing lcd and cd, are correct. However, my example still failed, until I replaced the backslashes with forward slashes. In other words, my final, working result is as follows:
anonymous#ip address
username
prompt off
binary
lcd /destination directory
cd /source directory
mget *.*
quit
Are you looking for LCD and CD, where LCD changes the directory on the local machine? E.g.:
LCD c:\destination directory
mget c:\source directory\*.*
In most ftp clients you can set the working directory on the server with the command cd, and you set the working directory on the client with the command lcd.
But it is not clear to me what you are trying to do.
Are you trying to move or copy files that are on the FTP server to another location on the FTP server? As far as I know, you cannot do that directly with FTP. If you wished to copy files from one folder on the FTP server to another, you would have to get a copy to the local system and then re-upload it to the new folder. If you wish to move files, you can use the rename command.
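For example, in the same kind of script file, a line like the following moves a file between folders on servers that allow it (the paths are placeholders):
rename /oldfolder/file.txt /newfolder/file.txt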
I need a cgi-bin directory, but I can't find one when I'm connected with FileZilla or cPanel; there's just public_html and that's it.
What do I do? I've looked online and it says I need to use a command line, but I have no clue how I would do that!
The "cgi-bin" directory does not exist until you create it with your FTP program at the top level of your Web site directory. From the perspective of your FTP program, it's just a normal directory (folder) that you can create, but it's treated differently by the server because of its special name.
Open your FileZilla program and connect to your website.
When you are in your top-level directory (where you see "public_html"),
right-click anywhere in the window and select "Create directory."
Simply name the directory "cgi-bin".
You're done!
Kindly see this screencast to get a better idea of our requirement:
https://www.screenr.com/QmDN
We want to automate the Text Datasource generation and its connection to MS Excel, in order to make it easier for the end user to connect the Text Datasource (CSV) to MS Excel so that they can generate their own reports.
The steps I have in mind:
Use WinSCP FTP Client with Scripting
Write a script to get the most recently updated file from the FTP folder
Or instead of step 2, download all generated files from FTP to a Shared Folder on the Network.
Get the most recent version of the Generated CSV File
Rename the file to the Standard Naming Convention. This must be the name used in MS Excel as the CSV Text Datasource.
Delete all other files
I developed a sample script that WinSCP can use to download the files from the FTP folder:
# Automatically abort script on errors
option batch abort
# Disable overwrite confirmations that conflict with the previous
option confirm off
# Connect
open CSOD
# Change remote directory
cd /Reports/CAD
# Force binary mode transfer
option transfer binary
# Download file to the local directory d:\
#get "Training Attendance Data - Tarek_22_10_21_2014_05_05.CSV" "D:\MyData\Business\Talent Management System\Reports\WinCSP\"
get "*.CSV" "D:\MyData\Business\Talent Management System\Reports\WinCSP\Files\"
# Disconnect
close
exit
Then, I can schedule the above code to run periodically using this command:
winscp.com /script=example.txt
The above sample is working fine, but the main problem is how to identify the most recent file, so that I can rename it, and delete all the other files.
Appreciate your help.
Tarek
Just add the -latest switch to the get command:
get -latest "*.CSV" "D:\MyData\Business\Talent Management System\Reports\WinCSP\Files\"
For more details, see the WinSCP article Downloading the most recent file.
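For example, adapting the script from the question (the fixed local file name below is a placeholder, and it is an assumption on my part that, because -latest resolves the wildcard to a single file, you can give get a target file name instead of a directory, which would also cover your renaming step):
option batch abort
option confirm off
open CSOD
cd /Reports/CAD
option transfer binary
# -latest picks the single most recent file matching *.CSV.
# Assumption: since only one file is downloaded, a target file name
# (instead of a directory) saves it under the standard name.
get -latest "*.CSV" "D:\MyData\Business\Talent Management System\Reports\WinCSP\Files\Standard.csv"
close
exit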
You don't specify the language you use, so here is a Ruby script that downloads the most recent file from an FTP path, just to demonstrate how easily and tersely this can be done with a scripting language like Ruby.
require 'net/ftp'
Net::FTP.open('url of ftpsite') do |ftp|
  ftp.login("username", "password")
  path = "/private/transfer/*.*"
  # ftp.list returns "ls -l" style lines; file[55..-1] extracts the file name
  # part of each returned line (adjust the offset to match your server's listing)
  most_recent_file = ftp.list(path)[2..-1].sort_by { |file| ftp.mtime(file[55..-1]) }.reverse.first[55..-1]
  puts "downloading #{most_recent_file}"
  ftp.getbinaryfile(most_recent_file, File.basename(most_recent_file))
  puts "done"
end
We are using TortoiseHg as our Mercurial client UI. Today we ran into an issue while trying to push from one particular workstation. It was receiving the following error:
abort: No space left on device
[command returned code 255 ..........]
This error occurs while TortoiseHg/Mercurial is bundling files in preparation for pushing to the repository. I did some testing and noticed that the workstation's (C:) drive was gradually being filled up as the files were being bundled. The (C:) drive's free space went from ~900MB to ~100MB, and then the error message was received. So this is obviously the cause.
My question is this:
Does anyone know which default directory is used to store the temp files created while TortoiseHg/Mercurial bundles files in preparation for a push? This seems to be independent of the drive TortoiseHg is installed on. I re-installed to a data drive with plenty of space, and it still used (C:) to store whatever temp files it was using.
Is there a way to configure TortoiseHg/Mercurial to use a temp directory of your choice?
Thanks in advance for any help!
Mercurial is Python, and Python has good platform-specific defaults for temporary file locations. They're pretty easily overridden if you want something other than the defaults, which on Windows are probably C:\TEMP.
http://docs.python.org/library/tempfile.html#tempfile.tempdir says it's:
The directory named by the TMPDIR environment variable.
The directory named by the TEMP environment variable.
The directory named by the TMP environment variable.
A platform-specific location:
On RiscOS, the directory named by the Wimp$ScrapDir environment variable.
On Windows, the directories C:\TEMP, C:\TMP, \TEMP, and \TMP, in that order.
On all other platforms, the directories /tmp, /var/tmp, and /usr/tmp, in that order.
As a last resort, the current working directory.
So if you've got software using Mercurial on a client computer, set the environment variable to some place you know has space.
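For example, on the affected workstation you could point the temp location at a drive with room before pushing (D:\hgtemp is just a placeholder directory that must already exist):
rem Applies only to this command prompt session
set TEMP=D:\hgtemp
set TMP=D:\hgtemp
hg push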
Mercurial always stores internal files inside the ".hg" folder in the local repository folder.
Maybe TortoiseHg has an additional temp folder... I don't know. Anyway, you should try to push the files using the Mercurial command-line client:
hg push
You can find more information about the command-line client here: Mercurial: The Definitive Guide
Another temporary solution might be to move these files via a file system symlink to another drive with more space left.
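For example, on Windows Vista or later you could relocate the temp folder with a directory junction (the paths below are placeholders; the existing folder has to be moved out of the way first):
rem Move the existing temp folder to the larger drive, then junction it
move C:\Temp D:\Temp
mklink /J C:\Temp D:\Temp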
I have developed a download/upload manager script.
When I upload a file via the POST method, it is stored in a folder called files; the files folder is within another folder called download-manager.
Now it seems that when I upload via the POST method, a CHMOD of 0666 works when I want to rename or delete the file, but the download-manager folder and the files folder need to be CHMOD 0777 for this to work. Can someone tell me if this is dangerous?
1) I have a deny all in .htaccess so nobody can access the files directory via a browser.
2) The upload script is protected by a username and password, which the person who uses the script will obviously change, so basically only admins can upload, rename, edit, and delete files and the records in the MySQL database.
When a file is uploaded, a record is added to the database with information like file type, file name, file size, etc., and then the unique id (auto-incremented by MySQL) is appended to the process.php URL. process.php gets the file from the directory, so the path, MIME type, etc. are not revealed; it basically checks whether the record and the file exist and, if so, forces the download of that file.
Basically the download URL is like www.mydomain.com/process.php?file=57; a check is done to make sure that id exists in the database and that a file exists with the file name stored in the database for that id.
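For context, the core of process.php follows roughly this pattern (a simplified sketch; the real script has more checks, and the database connection, table, and column names here are placeholders):
<?php
// Simplified sketch of process.php; $pdo, table, and column names are placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=downloads', 'dbuser', 'dbpass');

$id = (int) $_GET['file'];

// Look up the record for this id.
$stmt = $pdo->prepare('SELECT file_name, file_type FROM files WHERE id = ?');
$stmt->execute([$id]);
$row = $stmt->fetch();

if ($row) {
    $path = __DIR__ . '/files/' . basename($row['file_name']);
    // Only force the download if the file really exists on disk.
    if (is_file($path)) {
        header('Content-Type: ' . $row['file_type']);
        header('Content-Length: ' . filesize($path));
        header('Content-Disposition: attachment; filename="' . basename($path) . '"');
        readfile($path);
        exit;
    }
}

http_response_code(404);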
Now all this works fine when uploading the file via a form using the POST method, but I also added a manual upload: people who want to upload a file larger than the size their web host allows can simply upload the file via an FTP program, for example, and then add the file name and file details manually via a form in the admin area to link the record with the file. The problem is then a permission issue, because if the file is uploaded via FTP (or whatever other way they upload it), the PHP script cannot rename or delete the file if needed in the future, as the PHP script does not have the correct privileges. So from what I gather, the only option is telling the people who use the script to change the file's CHMOD to 0777 for it to work; I think that will make it work?
But then I have the problem of 0777 also being executable. The script allows any file type to be uploaded, as it's a download/upload manager script, but at the same time I am slightly confused by all this permissions lark and what I should actually be doing. As PHP is limited by the max upload size set by the host, I want to add a manual upload so users can upload the file by another method and assign the file to the database record, but then, as stated, I get a problem when wanting to rename or delete the file via the PHP script.
I have developed the script to detect such problems and notify the user, but I would like to make the script do all (or nearly all) of the legwork without having to state in the manual that the admin will have to chmod the file to 0777 when they want the script to rename or delete it. I also don't know whether just chmodding the file to 0777 will actually allow the PHP script to rename or delete it and so forth, and security is then a concern.
UPDATED
OK, thanks. So chown the file before chmodding it on upload?
Do I just use chown() on the file and nothing else, and that will make it owned by the server process and make it private? As I see you have:
chown apache:apache '/path/to/files' ;
Do I need to add the apache:apache bit?
I did think of a simpler solution: if an admin does a manual upload, tell them they will have to rename/delete the file manually if needed in the future, because the script won't have the correct permissions to do so. This would make it an easy solution, as the manual-upload script can just rename the db record to keep it linked to the file. That way there are no worries about file permission issues.
Simply put, the user changes the file manually via FTP, for example from myfile.zip to somefile.zip, then they edit the db record for that file and change the filename from the old myfile.zip to somefile.zip; that way everything is still linked but there are no worries about permission issues. I have also been reading that chown() does not always work or cannot be relied on for whatever reason.
1) I have a deny all in .htaccess so nobody can access the files directory via a browser.
Store your files in a separate folder, away from the directory structure that houses your PHP files.
As far as the permissions on the directory are concerned, there are three ways to go about setting up the permissions on the folder:
Make it world-writable (chmod 0777 '/path/to/files/')
This is not recommended, as it has major security implications especially on a non-dedicated server; anyone who has an account or can tell a process on the server to write/delete to that folder will be able to change its contents.
Make it temporary (chmod 1777 '/path/to/files/')
This also carries a security concern, but less so than option 1 for the following reason: users cannot modify the directory--only the files they own.
Make it owned by the server process and make it private (chown apache:apache '/path/to/files' ; chmod 0700 '/path/to/files')
This is arguably the best solution.
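For illustration, here is a minimal sketch of the upload path under option 3, assuming the files directory has already been chowned to the web server user (the directory path and the 'upload' form field name are placeholders):
<?php
// Sketch only: $uploadDir and the 'upload' field name are placeholders.
$uploadDir = '/path/to/files'; // owned by the web server user, chmod 0700

$target = $uploadDir . '/' . basename($_FILES['upload']['name']);

// Because the web server process creates the file here, the PHP script
// remains its owner and can rename()/unlink() it later without 0777.
if (move_uploaded_file($_FILES['upload']['tmp_name'], $target)) {
    chmod($target, 0600); // readable/writable by the owner only
}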
Just relax & enjoy.
On many shared hosts it's the only possible solution anyway.
There is another option: ask the user for their FTP password and use FTP for copying the files from tmp, like WordPress does. But I think it's even less secure.
I need to copy the entire contents of a directory at an FTP location to a shared network location. The FTP Task has you specify the exact file name (not a directory), and the File System Task does not allow accessing an FTP location.
EDIT: I ended up writing a script task.
Nothing like reviving a really old thread... but there is a solution to this.
To copy all the files from a directory, specify your remote path as /[directory name]/*
Or, for just files and not directories, /[directory name]/*.*
Or, for specific file types, /[directory name]/*.csv
I've had some similar issues with the FTP task before. In my case, the file names changed based on the date and some other criteria. I ended up using a Script Task to perform the FTP operation.
It looks like this is what you ended up doing as well. I'd be curious if anyone else can come up with a better way to use the FTP task. It's nice to have...but VERY limited.
When I need to do this sort of thing I use a batch file to call FTP on the command line and use the mget command. Then I call the batch from the DTS/DTSX package.
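A minimal sketch of that approach (the host, credentials, and paths are placeholders; Z: stands in for a drive mapped to the shared network location):
getfiles.bat:
ftp -s:getfiles.txt ftp.example.com
getfiles.txt:
username
password
prompt off
binary
lcd Z:\destination
cd /source directory
mget *.*
quit
The batch file can then be called from something like an Execute Process Task in the DTS/DTSX package.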