SSIS package to export data to a CSV file and transfer it to FTP

I'm creating an SSIS package to get a .csv file onto my local server and transfer it to FTP.
When I pull the csv from FTP and open it in Excel, my data gets shifted over into other columns. Is there any internal setting I need to change?
I also tried different text qualifiers, but that still did not work.

It sounds like there may be hidden characters in your data set. If you are using commas, you may want to consider a less common character for the delimiter, such as a pipe "|". An address, for instance, may naturally contain commas; a pipe showing up in an address field is probably a typo and is far less likely. The things that shift data into other cells are often tab characters and CRLF sequences. You can also open your data set in a text editor like Notepad++ and choose "Show All Characters" under the View -> Show Symbol menu to see exactly which character it is. If it's rampant in your data set, you can use the REPLACE function within a Derived Column transformation to scrub the data as it comes out of the data source.
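For example, a minimal sketch of such a Derived Column expression, assuming a hypothetical Address column (the column name and the choice of replacing tabs with a space are mine, not the poster's):

REPLACE(REPLACE(REPLACE(Address, "\t", " "), "\r", ""), "\n", "")

This strips the CR, LF, and tab characters that most commonly shift CSV fields, before the row ever reaches the flat file destination.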

Related

Cannot import Unicode flat file in SSIS

I have a flat file with almost 300 columns that needs to be imported into SQL Server. It seems that if I use SSIS, it can't read the csv file if I mark it as Unicode. It seems that it loses the ability to recognize CR and LF.
The specified header or data row delimiter "{CR}{LF}" is not found
after scanning 524288 bytes of the file "...\contact.csv". Do you want
to continue scanning this file?
What am I doing wrong?
EDIT
Based on comments, it seems I need to clarify - yes I did check that {CR}{LF} is present at the end of each line, and that it's set as a row delimiter in the connector.
The problem is with the "unicode" checkbox. If I uncheck it, the file is read fine. If I check it, it doesn't find {CR}{LF} any more.
It also doesn't matter what encoding I set on the file, as that only affects the default "code page" selection.
OK, after a while I found an answer.
The Unicode checkbox is still not working, but you can go to the Advanced section of the Flat File Connection Manager and set your string columns to Unicode. It's kind of tedious, and I don't know what I would do if I had 200 columns, but for my small data set it worked.
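If memory serves, the Advanced page also lets you Ctrl- or Shift-select several columns at once and set the shared DataType property to Unicode string [DT_WSTR] for the whole selection, which should make even a couple of hundred columns manageable.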

Can I export from a query in Access to a text file without wrapping strings in quotes

I'm trying to do an export from Access into a text file via a query:
select CustomerName
into [Text;FMT=TabDelimited;HDR=NO;DATABASE=C:\Temp\;].CustomerList.txt
from Customer
however, every line is getting wrapped in double quotes. Is there a way to turn off the quoting (I'm only ever exporting one column), or can I use a custom quote character (e.g. set it to blank)?
Method 1
You have to manually add a schema.ini file in the directory you are exporting to.
In your case, it should contain:
TextDelimiter=none
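A fuller sketch of that schema.ini, assuming the export statement above (the section header must match the output file name; the ColNameHeader and Format lines just restate the HDR=NO and tab-delimited options already chosen in the query):

[CustomerList.txt]
ColNameHeader=False
Format=TabDelimited
TextDelimiter=none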
Method 2
Another way to do it is to use the TransferText method, with:
SpecificationName: Optional Variant. A string expression that's the name of an import or export specification you've created and saved in the current database. For a fixed-width text file, you must either specify an argument or use a schema.ini file, which must be stored in the same folder as the imported, linked, or exported text file. To create a schema file, you can use the text import/export wizard to create the file. For delimited text files and Microsoft Word mail merge data files, you can leave this argument blank to select the default import/export specifications.
For your export specification, which is a one-shot operation, you can use the wizard; it has an "Advanced" button that brings up a dialog where you can set the text delimiter to nothing.
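Once the specification is saved, a hedged VBA sketch of the call (the spec name "CustomerListSpec" and the query name "CustomerList" are placeholders, not names from the question):

' Export the saved query with the saved spec; the final False means no header row
DoCmd.TransferText acExportDelim, "CustomerListSpec", "CustomerList", "C:\Temp\CustomerList.txt", False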
Google is your friend. You've got enough clues now to sort it out.

SQL Server Export Unicode & Import via SSIS

(SQL Server 2008)
So here's my task...
I need to export query results to a file, and then import that file using SSIS into another DB.
Specific to the task, the data contains every awkward Unicode character you can think of, so delimiting with commas, pipes, etc. is out of the question.
Here are the options SSMS gives me for export format:
Column Aligned
Comma/Tab/Space delimited
Custom delimiter
And here are the options SSIS gives me for a flat file data source:
Delimited (custom)
Fixed Width
Ragged Right
So given that a delimiter character is out of the question... I cannot see another method that both SSMS and SSIS agree on.
Such as fixed width?
Seems strange that the two closely related MS products have such different options.
Or have I missed something here?
Any advice appreciated!
It seems you need to try out different combinations of options while creating the delimited flat file (for your exported query result).
Try setting the code page to UTF-8, both with and without the Unicode option. Also use a text qualifier of " (or any character of your choice that you think might work), and try different options for the column delimiter.
Once you are able to create the delimited file, you then have to apply the same settings to the file while importing it into the other DB.
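If none of the SSMS export combinations work out, one alternative worth knowing (not mentioned in the answer above) is to bypass SSMS and export with bcp in wide mode; the server name, path, and query below are placeholders:

bcp "SELECT Col1, Col2 FROM MyDb.dbo.MyTable" queryout C:\export\mytable.dat -w -T -S MYSERVER

The -w switch writes Unicode (UTF-16) output with tab field terminators and a newline row terminator, a combination the SSIS flat file source can be configured to read.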

Junk characters at the beginning of file obtained via column transformations in SSIS

I need to export varbinary data to files. But when I do it using column transformations in SSIS, the exported files are corrupt: there are a few junk characters at the start of each file, and on removing them, the file opens fine.
A similar post about BCP says that these characters specify the data length.
I would like to know how to address this issue in SSIS.
Thanks
The Export Column transformation is used for converting varbinary data to files. I have tried something similar using AdventureWorks, which has image-type varbinary data.
The following query is used as the source query. I modified the query to build the full path for writing the image files, since the table does not store it:
SELECT [ProductPhotoID]
,[ThumbNailPhoto]
,'D:\SSISTesting\ThumnailPhotos\' + [ThumbnailPhotoFileName] AS ThumbnailPhotoPath
,[LargePhoto]
,'D:\SSISTesting\LargePhotos\' + [LargePhotoFileName] AS LargePhotoPath
,[ModifiedDate]
FROM [Production].[ProductPhoto]
I used the Export Column transformation (also available in 2005 and 2008) and configured it as follows.
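The configuration, as I read it, pairs each varbinary column with its computed path column (the property names below are the Export Column transformation's own; the alias names come from the query above):

Extract Column = ThumbNailPhoto    File Path Column = ThumbnailPhotoPath
Extract Column = LargePhoto        File Path Column = LargePhotoPath
Allow Append = false, Force Truncate = true (so files are overwritten on re-runs)

If junk bytes still show up at the start of the files, make sure Write Byte-Order Mark is unchecked; a BOM only makes sense for Unicode text columns, not for binary image data.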
I mapped the rest of the columns to the destination.
After running the package, all the image files are written into their respective folders (D:\SSISTesting\ThumnailPhotos\ and D:\SSISTesting\LargePhotos\).
Hope this helps!

Problems importing Excel data into MySQL via CSV

I have 12 Excel files, each one with lots of data organized in 2 fields (columns): id and text.
Each Excel file uses a different language for the text field: Spanish, Italian, French, English, German, Arabic, Japanese, Russian, Korean, Chinese, Japanese and Portuguese.
The id field is a combination of letters and numbers.
I need to import every Excel file into a different MySQL table, so one table per language.
I'm trying to do it the following way:
- Save the Excel file as a CSV file
- Import that CSV in phpMyAdmin
The problem is that I'm running into all sorts of issues and can't get them to import properly, probably because of encoding problems.
For example, with the Arabic one, I set everything to UTF-8 (the database table field and the CSV file), but when I do the import, I get weird characters instead of the normal Arabic ones (if I copy them over manually, they show fine).
Another problem is that some texts contain commas, and since the CSV file also uses commas to separate fields, the imported texts get truncated wherever there's a comma.
Yet another problem is that, when saving as CSV, the characters get messed up (like in the Chinese one), and I can't find an option to tell Excel what encoding I want to use in the CSV file.
Is there any "protocol" or "rule" that I can follow to make sure that I do it the right way? Something that works for each different language? I'm trying to pay attention to the character encoding, but even with that I still get weird stuff.
Maybe I should try a different method instead of CSV files?
Any advice would be much appreciated.
OK, how did I solve all my issues? FORGET ABOUT EXCEL!!!
I uploaded the Excel files to Google Docs spreadsheets, downloaded them as CSV, and all the characters were perfect.
Then I just imported them into the corresponding fields of the tables, using a "utf8_general_ci" collation, and now everything is stored perfectly in the database.
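For the import step, a minimal sketch of the LOAD DATA statement (the file path, table name, and the assumption of a header row are mine; the CHARACTER SET, ENCLOSED BY, and TERMINATED BY clauses are what deal with the encoding and the embedded commas):

LOAD DATA LOCAL INFILE '/path/to/arabic.csv'
INTO TABLE texts_arabic
CHARACTER SET utf8
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES  -- skip the header row, if the CSV has one
(id, `text`);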
One standard thing to do in a CSV is to enclose fields containing commas with double quotes. So
ABC, johnny can't come out, can he?, newfield
becomes
ABC, "johnny cant't come out, can he?", newfield
I believe Excel does this if you choose to save as file type CSV. A problem you'll have is that CSV is ANSI-only. I think you need to use the "Unicode Text" save-as option and live with the tab delimiters or convert them to commas. The Unicode text option also quotes comma-containing values. (checked using Excel 2007)
EDIT: Added specific directions
In Excel 2007 (the specifics may be different for other versions of Excel)
Choose "Save As"
In the "Save as type:" field, select "Unicode Text"
You'll get a Unicode file. UCS-2 Little Endian, specifically.
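One caveat with that route (not from the original answer): MySQL's LOAD DATA cannot read UCS-2/UTF-16 files directly, so convert the file to UTF-8 first, e.g. with iconv (file names are placeholders):

iconv -f UTF-16LE -t UTF-8 export.txt > export-utf8.txt

After conversion, the tab-delimited file imports with FIELDS TERMINATED BY '\t' instead of a comma.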