I have an SSIS package that needs to connect to an FTP server and retrieve only the files that aren't already on my local machine (example below), process those files by inserting them into SQL Server, archive the processed files, and then delete only the processed files from the FTP server (example below).
I have successfully built the ForEach Loop Container that inserts the files into SQL Server and then archives them, but the two FTP tasks are not working.
I've searched for this, but I'm not finding this specific FTP retrieve-and-delete scenario, so if there is an article that covers it, please pass it on.
FTP Retrieve Example:
FTP Server: (file1.xml, file2.xml, file3.xml)
Local Machine: (file1.xml)
Need: (file2.xml, file3.xml)
FTP Delete Example:
FTP Server: (file1.xml, file2.xml, file3.xml, file4.xml)
Processed: (file1.xml, file2.xml, file3.xml)
Delete: (file1.xml, file2.xml, file3.xml)
I think you will find your answer in the links below:
http://www.sqlservercentral.com/Forums/Topic577774-148-1.aspx
http://www.cozyroc.com/script/get-ftp-file-list-task
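Whichever way you end up scripting it (the links above go into SSIS-specific approaches for getting the FTP file list), the core logic is a set difference between the remote and local listings. Here is a minimal sketch of that idea in Python's ftplib; the host, credentials, and paths are placeholder assumptions:

    import os
    from ftplib import FTP

    LOCAL_DIR = r'C:\ftp_staging'          # assumption: your local drop folder

    ftp = FTP('ftp.example.com')           # assumption: your FTP host
    ftp.login('user', 'password')

    remote_files = set(ftp.nlst())         # e.g. {file1.xml, file2.xml, file3.xml}
    local_files = set(os.listdir(LOCAL_DIR))

    # Retrieve only what we don't already have (file2.xml, file3.xml in the example)
    downloaded = sorted(remote_files - local_files)
    for name in downloaded:
        with open(os.path.join(LOCAL_DIR, name), 'wb') as f:
            ftp.retrbinary('RETR ' + name, f.write)

    # ... insert into SQL Server and archive the files here ...

    # Delete only the files that were just retrieved and processed
    for name in downloaded:
        ftp.delete(name)

    ftp.quit()

In SSIS you would put this kind of logic in a Script Task, since the built-in FTP Task transfers or deletes whatever paths it is given and can't compute the difference between the two listings on its own.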
I have to import a 180 MB table on a shared host (which means trouble, I know) and I'm hitting the max_questions error. The limit is 75,000 queries per hour and they are not willing to raise it even for an hour. The import needs about 2,000,000 queries.
I have been using BigDump to upload the gz file, but the limit is killing me.
Is there a way to split the table and upload it bit by bit, or to merge queries somehow? I have been searching online, but the solutions are mostly for people who can change the database settings.
I am assuming that you have phpMyAdmin installed in your website directory; otherwise, download the latest phpMyAdmin version from the official website and unzip it into your website directory. Then perform the steps below.
Access your website directory via SFTP or FTP. (If you cannot access it via SFTP or FTP, you can set up cron jobs to fetch the files instead.)
Find the config.inc.php file located in the phpmyadmin directory. In my case it is located here:
\public_html\phpmyadmin3.2.0.1\config.inc.php
Find the line with $cfg['UploadDir'] on it and update it to:
$cfg['UploadDir'] = 'upload';
Create a directory called 'upload' within the phpMyAdmin directory:
\public_html\phpmyadmin3.2.0.1\upload
Then place the large sql file that you are trying to import into the new upload directory.
Now, when you go to the Import page in the phpMyAdmin console, you will notice a drop-down that wasn't there before: it contains all of the SQL files in the upload directory you just created. Select your file and begin the import.
Reference: http://daipratt.co.uk/importing-large-files-into-mysql-with-phpmyadmin/comment-page-4/
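The UploadDir trick gets around the upload size limit, but if the max_questions quota is still the bottleneck, you can also split the dump into chunks and import one chunk per hour. A rough sketch in Python, assuming one statement per line as in a dump made with mysqldump --skip-extended-insert (the file names are placeholders):

    # Split a big .sql dump into pieces of at most MAX_STATEMENTS
    # statements each, keeping a margin under the 75,000-query quota.
    MAX_STATEMENTS = 70000

    def flush(chunk, part):
        with open('dump_part%02d.sql' % part, 'w', encoding='utf-8') as out:
            out.writelines(chunk)

    chunk, count, part = [], 0, 1
    with open('dump.sql', encoding='utf-8') as src:
        for line in src:
            chunk.append(line)
            if line.rstrip().endswith(';'):   # crude end-of-statement check
                count += 1
            if count >= MAX_STATEMENTS:
                flush(chunk, part)
                chunk, count, part = [], 0, part + 1

    if chunk:                                 # whatever is left over
        flush(chunk, part)

Each dump_partNN.sql file can then go into the upload directory and be imported in a separate hour.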
I have found a solution for loading XML file data into a MySQL table here.
My question is how I can load an XML file from a remote location and dump it into a MySQL database table.
If you have any idea or have done something like this, please help me find a solution.
Thank you.
The LOAD XML statement does not support retrieving files from a remote location. The file has to be either on the server or on the client (LOCAL, if enabled in the configuration). You need to write a script that saves the file from the remote location to either the server or the client.
If you manage to map a remote shared directory to the local filesystem, then you can import the data that way.
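For example, a minimal sketch in Python of the download-then-load approach; the URL, credentials, table name, and row tag are placeholder assumptions, and it requires mysql-connector-python plus local_infile enabled on the server:

    import urllib.request
    import mysql.connector  # assumption: mysql-connector-python is installed

    # Step 1: save the remote XML file locally
    urllib.request.urlretrieve('http://example.com/data.xml', 'data.xml')

    # Step 2: hand it to LOAD XML LOCAL INFILE
    conn = mysql.connector.connect(
        host='localhost', user='user', password='password',
        database='mydb', allow_local_infile=True)  # LOCAL must be enabled
    cur = conn.cursor()
    cur.execute(
        "LOAD XML LOCAL INFILE 'data.xml' "
        "INTO TABLE mytable ROWS IDENTIFIED BY '<row>'")
    conn.commit()
    conn.close()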
Is there a simple way to do an automated backup of an entire website on a host like GoDaddy via the command-line?
So far, I know I need to back up all the files in my home directory recursively. I could possibly automate SFTP to connect and issue a get -R * command to get the full file dump, or just use SCP.
The other half of the puzzle is getting all of the tables, mostly WordPress tables. My guess is that there's a command-line tool I could run which dumps the database contents to a flat file, which I could then also pull via SFTP. If such a command exists, my plan is to use a combination of Telnet and Expect scripts to log into the GoDaddy site, issue some commands, then disconnect back to my local shell.
The end result should be that I have a folder with all of my server content in it, plus the flat file backup of the SQL database from the server. I know there are WordPress backup plugins, but they tend to provide a slew of ZIP files, when all I want is the raw data directly so I can put it in my private SVN server for backup and versioning.
So my question: how do I extract all of the databases on my GoDaddy server via the command-line to a file?
Thank you.
In the end, I found a working solution using two separate Expect scripts:

1. Telnet into the server, delete the old backups, extract all tables to a flat file via mysqldump -u db_owner -p --all-databases > output.sql, and create a tarball of everything. Log out.
2. Use SCP to pull the newly created tarball and extract it into a local SVN-controlled working-copy folder.
3. Use the second Expect script to log into the server and delete the backup. Log out.

From there, I just manually svn add and svn commit as needed.
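For anyone wanting the same result without Telnet and Expect, here is a rough equivalent sketch in Python over SSH with paramiko; the host, credentials, and paths are placeholder assumptions:

    import paramiko  # assumption: paramiko is installed and the host allows SSH

    HOST, USER, PASSWORD = 'myhost.example.com', 'db_owner', 'secret'

    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(HOST, username=USER, password=PASSWORD)

    # Step 1: delete old backups, dump every database, tar everything up
    for cmd in (
        'rm -f backup.tar.gz output.sql',
        'mysqldump -u db_owner -p%s --all-databases > output.sql' % PASSWORD,
        'tar czf backup.tar.gz public_html output.sql',
    ):
        stdin, stdout, stderr = client.exec_command(cmd)
        stdout.channel.recv_exit_status()  # block until the command finishes

    # Step 2: pull the tarball down into the SVN working copy
    sftp = client.open_sftp()
    sftp.get('backup.tar.gz', 'backup.tar.gz')

    # Step 3: clean up on the server
    client.exec_command('rm -f backup.tar.gz output.sql')
    client.close()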
This is probably too simplistic a question but here goes.
I have a client that will drop .xls files into a folder on our FTP site. I need to check whether a file exists and, if so, move it from the FTP folder to a folder on the server. Once the processing is done, I need to send another (but different) .xls file back to a folder on the same FTP server.
I can see that there is an FTP task and I can connect to the FTP site, but I am unsure how to specify where to send the file and how to select only one file at a time.
I think if I just concentrate on the first part, I can work on getting the file back as a second step.
So the end result is to check the folder on the FTP site and, if a file exists, move it to the server.
The SSIS FTP task wraps the basic FTP syntax you would use if you were connecting to the FTP site interactively. Here's a review of basic FTP syntax.
So here's what you should be looking for when you're editing the FTP task. 1) The task needs to log into the FTP server, 2) it needs to know that it is performing a GET operation, 3) it needs to know the path and filename of the file it is supposed to retrieve from the FTP server, and 4) it needs to know where to drop the file on the local server.
So, in the FTP Task Editor, go to the General tab and create an FTP connection. Then go to the File Transfer tab, set "Operation" to "Receive files", and fill in values for the Local Path and the Remote Path. (Or you can keep those paths in SSIS variables and have the task read them from there.)
The IsTransferAscii setting is False by default. This means it will assume it is transferring a binary file. Alternatively, if you tell it to treat it like an Ascii file, it will try to fix the line endings to account for the different combinations of carriage return and line feed characters used by various operating systems. You don't want that if you want to transfer the file verbatim, but you might want it if you're going back and forth between Windows and Linux or something.
You should also learn a little interactive FTP syntax. I often use this to figure out why SSIS is having a problem transferring files. Go to the command prompt and type "ftp". You can then type "?" to see a list of commands. Or just type "ftp yourservername", log in, and use cd and ls to walk around the directory structure and see what's there.
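For example, a quick manual session that mirrors what the task's GET operation does, once you've supplied a user name and password (the server name, folders, and file name are placeholders):

    C:\> ftp ftp.example.com
    ftp> cd /incoming          (move to the remote folder)
    ftp> ls                    (list what's there)
    ftp> lcd C:\staging        (set the local folder to receive into)
    ftp> get file1.xml         (the GET operation the task performs)
    ftp> bye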
I'd like to install AdventureWorks2008 (I just installed SQL Server 2008 R2 Express).
Each time I download the recommended version from CodePlex, all I get is an AdventureWorks2008.mdf file. Not only can I not attach the file from SQL Server Management Studio, but I also cannot copy/paste the file directly into the database.
I've read in several places that I need to use AdventureWorks2008.msi, but I cannot find where to download it.
I just cannot figure out how to install AdventureWorks2008.
Thanks for helping
There isn't an .msi file for AdventureWorks, even though you'll find it mentioned in outdated documentation and books. You aren't alone in finding this confusing; it seems the web site, files, and steps Microsoft provides for installing these databases change every time I need to install them.
You need to create the database and attach the .mdf file, which is the "data file" referred to in the instructions. (.mdf = primary data file, .ldf = log file, .ndf = secondary data file)
In order to attach the file, you need to make sure you carefully follow the steps here: http://social.technet.microsoft.com/wiki/contents/articles/3735.sql-server-samples-readme-en-us.aspx#Readme_for_Adventure_Works_Sample_Databases
Instructions for 2008R2:

To install the AdventureWorks2008R2 OLTP database:

1. Download the AdventureWorks2008R2 data file.
2. From File Download, click Save and browse to a location on your local server.
3. From SQL Server Management Studio, execute the following code (case-insensitive database):

    CREATE DATABASE AdventureWorks2008R2
    ON (FILENAME = '{drive}:\{file path}\AdventureWorks2008R2_Data.mdf')
    FOR ATTACH_REBUILD_LOG;

As an alternative to step 3, you can attach the database using the SQL Server Management Studio user interface. For more detailed information, see Attach a Database (SQL Server Management Studio). Note: you must remove the log file from the list of files to attach; this will cause the operation to rebuild the log.
Headache-saving tip from Aaron Bertrand: place the .mdf file in your normal data folder, where SQL Server will already have the proper permissions. You can get this path using:

    SELECT TOP (1) physical_name FROM master.sys.database_files;
You can paste that file directly into your database directory. For more information, refer to http://tryingmicrosoft.com/error-while-attaching-a-database-to-sql-server-2008-r2/.