cannot load simple csv file into tableau public 9.3 - csv

I am trying to load the following simple csv file into tableau public 9.3:
customers,item1,item2,item3,item4
1,0,0,0,0
2,0,0,0,0
3,0,0,0,0
However, it doesn't read the file as separate columns, despite the field separator being set to Comma; instead it treats the whole line as one column. Any help would be greatly appreciated.

If you change your locale settings to English US you will be able to load the file. You should also be able to work around this by creating a schema.ini file.
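The locale matters because locales that use ',' as the decimal separator typically use ';' as the list separator, so the text driver does not treat the comma as a field delimiter. Below is a minimal sketch of such a schema.ini, assuming the file is named customers.csv (adjust to your actual file name) and placed in the same folder as the CSV; it is honored by the Microsoft Jet text driver, which, as far as I know, is what Tableau 9.x's legacy text connection uses:

[customers.csv]
Format=CSVDelimited
ColNameHeader=True
DecimalSymbol=.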

Go to Data > Manage fields > [Field] Options
You can also control how an imported CSV is parsed after import, either by splitting individual columns (the splits are preserved when the data is refreshed) or via the file-level options for the CSV itself.

That didn't work for me, so I reopened the .csv file in Excel and saved it again in .csv format with ',' as the delimiter.
After that, my file looks like a .csv with ';' as the delimiter, and it works with Tableau.

Related

Loading data from a UTF-16 LE (.txt) file to Azure SQL DB

We have a .txt file with UTF-16 LE encoding (also discussed here). We need to load this file into an Azure SQL database. We first tried to convert the file to CSV format using the Text Import Wizard in Excel 365. But if we use ^|^,^|^ as a custom delimiter, the first and last columns still end up containing the ^|^ value.
Question: What are possible solutions/workarounds for converting this type of file to CSV?
Remarks: This is a huge file (1 GB) with about 150 columns. The following is just a small sample to illustrate the scenario.
Sample of the txt file:
^|^Col0^|^,^|^Col1^|^,^|^Col2^|^,^|^Col3^|^,^|^Col4^|^,^|^Col5^|^,^|^Col6^|^,^|^Col7^|^
^|^1234^|^,^|^4600869848^|^,^|^6000.00^|^,^|^2021-12-20 10:16:19.3600000^|^,^|^False^|^,^|^^|^,^|^^|^,^|^2^|^
^|^5431^|^,^|^3425143451^|^,^|^30000.00^|^,^|^2021-12-13 10:27:44.9030000^|^,^|^False^|^,^|^^|^,^|^^|^,^|^2^|^
.....................
............................
After using the delimiter ^|^,^|^ in Excel text import wizard
Instead of specifying ^|^,^|^ as the custom delimiter, you can specify a comma as the delimiter; that will give you a result like the one below.
Then, after the import is done, you can record a macro to replace the leftover ^|^ characters, as described in the link below:
Create A Macro Code To Achieve Find And Replace Text In Excel
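If it is easier to do the cleanup on the database side instead of in Excel, a rough alternative to the macro is to bulk-load the comma-split rows into a staging table in the Azure SQL database and strip the markers there. This is only a sketch: the dbo.Staging table is hypothetical, and the column names follow the sample header.

-- Hypothetical staging table loaded from the comma-split file.
-- REPLACE strips the ^|^ markers that remain around each value.
UPDATE dbo.Staging
SET Col0 = REPLACE(Col0, '^|^', ''),
    Col1 = REPLACE(Col1, '^|^', ''),
    Col2 = REPLACE(Col2, '^|^', '');
-- ...repeat for the remaining columns (or generate the SET list dynamically).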

Importing PIPE delimited format txt into MySQL via PHPMyAdmin

I am importing several thousand lines of data from a .txt file containing two columns, in the following format:
A8041550408#=86^:|blablablablablablablablablablablablablablablablablablablabla1
blablablablablablablablablablablablablablablablablablablabla2
blablablablablablablablablablablablablablablablablablablabla3
A8041550408#=86^:|blablablablablablablablablablablablablablablablablablablabla1
blablablablablablablablablablablablablablablablablablablabla2
A8041550408#=86^:|blablablablablablablablablablablablablablablablablablablabla1
blablablablablablablablablablablablablablablablablablablabla2
blablablablablablablablablablablablablablablablablablablabla3
blablablablablablablablablablablablablablablablablablablabla4
etc....
What I have done so far is create a table with the two fields, but when I try to import the .txt file as a CSV with the columns-separated-by option set to : | , I get an error:
"Invalid column count in CSV input on line 2."
This is quite obvious, since the second line of the .txt file is empty.
Moreover, I have tried importing the file as a CSV using LOAD DATA, and that didn't work either; it just filled the table with random words and phrases from the .txt file.
So my question is: how can I import the data from this file?
You have to fix your file; in its current state you cannot expect the import module to be able to understand it. The first step would be to remove the empty lines: How to remove blank lines from a Unix file
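Once the file has been cleaned up so that each record sits on a single line with a consistent separator between the two fields, a LOAD DATA statement along these lines should work. This is only a sketch: the table and column names are hypothetical, and it assumes the literal :| sequence really is the field separator.

-- Hypothetical table with the two fields described in the question.
LOAD DATA LOCAL INFILE '/path/to/cleaned.txt'
INTO TABLE my_table
FIELDS TERMINATED BY ':|'
LINES TERMINATED BY '\n'
(code_field, text_field);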

Importing CSV file in Talend - how to set options to match Excel

I have a CSV file that I can open in Excel 2012 and it comes in perfectly. When I try to set up the metadata for this CSV file in Talend, the fields (columns) are not split the same way as Excel splits them. I suspect I am not setting the metadata properly.
The specific issue is that I have a column with string data in it which may contain commas within the string. For example suppose I have a CSV file with three columns: ID, Name and Age which looks like this:
ID,Name,Age
1,Ralph,34
2,Sue,14
3,"Smith, John", 42
When Excel reads this CSV file it looks at the second element of the third row ("Smith, John") as a single token and places it into a cell by itself.
In Talend it tries to break this same token into two, since there is a comma within the token. Apparently Excel ignores all delimiters within a quoted string, while Talend by default does not.
My question is: how do I get Talend to behave the same as Excel?
If you use the tFileInputDelimited component to read this CSV file, you can set the field separator to "," and, under the component's CSV options properties, enable the Text Enclosure option and set it to """ (i.e., a double quote). Even if you use metadata, there is an option to define the string/text enclosure; setting it to """ there should resolve your problem.

How to import .txt to MySQL table

How do I import a .txt file into a MySQL table?
My .txt file is like this...
ex : AF0856427R1 000002200R HADISUMARNO GGKAMP MALANG WET 3 6 00705 AFAAADF16000-AD-FA P00.001.0 1 000001.00022947.70023290.00 T511060856425A 022014B
There are 39 fields in my file.
Try the mysqlimport command.
The name of the text file should match the name of the table into which you want the data imported. For example, if your file is named patient.txt, the data will be imported into the patient table:
mysqlimport [options] db_name textfile
There are a lot of options that you can pass in. Documentation here.
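For illustration, a hypothetical invocation might look like the line below; the separator values are placeholders and have to match whatever the file actually uses (mysqlimport is essentially a command-line front end for LOAD DATA INFILE, so the options map directly):

mysqlimport --local --fields-terminated-by=',' --lines-terminated-by='\n' -u username -p db_name patient.txt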
Especially since some of your fields are terminated by spaces and some are based on string length, I would definitely first do some string manipulation with your favorite tool (sed, awk, and perl are all likely very good choices).
Create an intermediary comma separated file. If you have commas in the existing file, you can easily use some other character. The goal is to create a file that has one consistent separator.
You've used the phpMyAdmin tag, so from your table go to the Import tab, select the file, and pick CSV from the dropdown of file types. Edit the options according to what your file looks like (for instance, perhaps § is your column separator and you might leave the next two options blank). Then try the import and check the data to make sure it all arrived in the columns you expected.
Good luck.

Junk characters at the beginning of file obtained via column transformations in SSIS

I need to export varbinary data to files. But when I do it using column transformations in SSIS, the exported files are corrupt: there are a few junk characters at the start of the file. On removing them, the file opens fine.
A similar post about BCP says that these characters specify the data length.
I would like to know how to address this issue in SSIS.
Thanks
The Export Column transformation is used for converting the varbinary data to files. I have tried something similar using AdventureWorks, which stores images in varbinary columns.
The following query is used as the source query. I modified it to build the full path for writing the image files, since the table does not store full paths.
SELECT [ProductPhotoID]
,[ThumbNailPhoto]                                           -- varbinary column holding the thumbnail image
,'D:\SSISTesting\ThumnailPhotos\'+[ThumbnailPhotoFileName]  -- full output path for the thumbnail file
,[LargePhoto]                                               -- varbinary column holding the large photo
,'D:\SSISTesting\LargePhotos\'+[LargePhotoFileName]         -- full output path for the large photo file
,[ModifiedDate]
FROM [Production].[ProductPhoto]
I used the Export Column transformation (also available in SSIS 2005 and 2008) and configured it so that each varbinary column is the Extract Column and the corresponding path expression is the File Path Column.
I mapped the rest of the columns to the destination.
After running the package, all the image files are written into the respective folders (D:\SSISTesting\ThumnailPhotos\ and D:\SSISTesting\LargePhotos).
Hope this helps!