Importing CSV files with null characters as empty strings - MySQL

I have hundreds of CSV files, and each one has lots of null characters in it. They are like that because some of the cells must be empty. But when I try to import them into MySQL Workbench using the import wizard, I keep getting the same error: "Unhandled exception: line contains NULL byte".
What I would like to do is:
a) import these files without the error above, and
b) convert all null cells to empty strings.
Since there are hundreds of CSV files like this one, each around 300 MB, replacing the characters before importing doesn't seem like a quick, viable option.
Is there a way to force MySQL Workbench to accept files with null characters in them?
I have googled many answers, none of which seems to be applicable to this case.
Many thanks

Since MySQL Workbench version 8.0.16 (released on 04/25/2019) there has been an additional option for importing a .csv file -- "null and NULL word as SQL keyword".
When this option is set to NO, an unquoted NULL in the .csv file (,NULL, rather than ,"NULL",) will be imported as empty, provided the field's default value is empty.
I hope this answer solves other people's similar problems :)
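If you are stuck on an older Workbench version, preprocessing is less painful than it sounds: a streaming pass that strips the NUL bytes reads each 300 MB file only once and leaves the affected cells empty. A minimal Python sketch, where the csv_in and csv_out directory names are placeholders:

    import pathlib

    SRC = pathlib.Path("csv_in")   # placeholder input directory
    DST = pathlib.Path("csv_out")  # placeholder output directory
    DST.mkdir(exist_ok=True)

    for src_file in SRC.glob("*.csv"):
        with open(src_file, "rb") as fin, open(DST / src_file.name, "wb") as fout:
            # Read in 1 MiB chunks so a 300 MB file never sits fully in memory.
            while chunk := fin.read(1 << 20):
                # Dropping the NUL bytes leaves the affected cells empty,
                # which then import as empty strings.
                fout.write(chunk.replace(b"\x00", b""))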

Related

SSIS data conversion (from unicode to ANSI) returned status value 4

I have the following problem:
I have an SSIS package that starts with a query executed on an Oracle DB, and I would like to export a fixed-width flat file with the ANSI 1253 code page. I get an error:
The data conversion for column [column_name] returned status value 4
and status text "Text was truncated or one or more characters had no
match in the target code page"
The problem has to do with the second part of the message, as the width is fine. I tried to use Data Conversion from the Toolbox, but it didn't work (probably I didn't use it the right way). I have only SELECT privileges on the database, so I cannot add any SQL procedures to remove special characters in the query. Loading the data into a staging table wouldn't be the best choice in my case either. Does anyone have any idea how to convert my data without getting this error?
Thanks a lot in advance
Load the data using your Oracle DB source and keep the data types it gives you.
Then add a Derived Column transformation and cast your column:
(DT_STR, <length>, 1253)[columnName]
If the column is ntext, you need two casts to get to a string:
(DT_STR, <length>, 1253)((DT_WSTR, <length>)[ntextColumn])
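For what it's worth, the "no match in the target code page" half of the message means some character in the data simply does not exist in Windows-1253. A rough Python sketch (the sample values are invented) shows how to spot such values before the package runs:

    # Flag values that cannot be represented in Windows-1253 (Greek ANSI),
    # the situation SSIS reports as "no match in the target code page".
    samples = ["Αθήνα", "café", "naïve £100"]  # hypothetical column values

    for value in samples:
        try:
            value.encode("cp1253")
        except UnicodeEncodeError as exc:
            print(f"{value!r}: cannot map {value[exc.start:exc.end]!r} to cp1253")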

Import fails from CSV file into SQL Server 2012 table

I am trying to import a rather large (520k rows) .CSV file into a SQL Server 2012 table. The file uses a delimiter of ";".
Please do not edit my delimiter. It is ";" I know that may seem strange, but that is what they used. It is not just a semicolon.
I don't think the delimiter is the issue, because I replaced it with a tab and it seemed to be okay. When I try importing the file, I get a text-truncation error, even though I set the column to 255 just to be sure it had plenty of room.
Even when I delete the offending row, the next row causes the error. I don't see any offending characters in the data, so I am at a loss as to what the issue is.
I ended up using EOL Conversion in Notepad++, selected Windows format, and then created a script to import the data.
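For a file too large to handle comfortably in an editor, the same EOL conversion can be scripted. A small sketch, assuming the file is UTF-8; input.csv and output.csv are placeholder names:

    # Rewrite a file with Windows (CRLF) line endings, mirroring what
    # Notepad++'s EOL Conversion to Windows format does.
    # Universal-newline reading maps \r, \n, and \r\n all to "\n";
    # newline="\r\n" on the output turns each "\n" back into CRLF.
    with open("input.csv", "r", encoding="utf-8") as fin, \
         open("output.csv", "w", encoding="utf-8", newline="\r\n") as fout:
        for line in fin:
            fout.write(line)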

How do I import a column with values like '661-' into MySQL?

I have a CSV file that came from a legacy dBASE .dbf file. The data contains a few columns which have numeric values with trailing hyphens, like '661-'. I am trying to import the CSV into MySQL using 'Import External Data' in SQLyog. The issue is that the columns with hyphenated values are being imported as decimals, resulting in '-661.0000'.
This is odd, as the column format in the CSV (via MS Excel) is 'General', not 'Number', and I am trying to import these values into varchar fields. It seems MySQL is ignoring the settings.
Has anyone faced something like this, or does anyone have suggestions on how I can get the data in as a string, not a decimal?
Thanks
ANSWER - sorry to be answering my own question, but I did solve it with some group input.
The file needs to be saved with all fields that you want treated as strings enclosed in quotes. MS Excel does NOT seem to have an option for this; Apache OpenOffice does. Open the file in AOO and save it as Text CSV. From there you can edit the filter settings and set all cells to be quoted. Problem solved.
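If OpenOffice is not at hand, the same all-fields-quoted file can be produced with a few lines of Python. A sketch, assuming a comma-delimited file; the file names are placeholders:

    import csv

    # Rewrite the CSV with every field quoted, so import tools that infer
    # types are more likely to treat values like 661- as text, not numbers.
    with open("legacy.csv", newline="") as fin, \
         open("legacy_quoted.csv", "w", newline="") as fout:
        reader = csv.reader(fin)
        writer = csv.writer(fout, quoting=csv.QUOTE_ALL)  # quote every field
        writer.writerows(reader)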

SSIS 2012 extracting bool from .csv failing to insert to db "returned status 2"

Hi all, quick question for you.
I have an SSIS 2012 package that reads a flat file (.csv) and loads it into a SQL Server database table. However, I am getting an error for one of the columns when loading the OLE DB Destination:
[Flat File Source [32]] Error: Data conversion failed. The data conversion for column "Active_Flag" returned status value 2 and status text "The value could not be converted because of a potential loss of data.".
I am wondering if this is because, in the flat file (which is comma delimited), the values are literally spelled out as "TRUE" or "FALSE". The Advanced page of the flat file properties has the column set to "DT_BOOL", which I thought was right. It was DT_STR originally, and that wasn't working either.
In the SQL Server table the column is set up as a bit and allows nulls. Is this because the values are literally typed out as TRUE/FALSE? What's the easiest way to fix this?
Thanks for any advice!
It actually turned out there was a blank space in front of "True"/"False" in the file. It was just bad data, and I missed it. Fixing that solved my issue. Thank you though; I did try that, and when it didn't work, that's when I knew it was something else.
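In case anyone else hits this, stray whitespace like that can be found with a quick scan before SSIS ever sees the file. A sketch using the Active_Flag column from this question; the file name is a placeholder:

    import csv

    # Report rows where Active_Flag carries stray whitespace or is not a
    # clean TRUE/FALSE, which breaks SSIS's string-to-DT_BOOL conversion.
    with open("flat_file.csv", newline="") as f:  # placeholder file name
        # start=2: the header sits on line 1, so data begins on line 2.
        for line_no, row in enumerate(csv.DictReader(f), start=2):
            value = row["Active_Flag"]
            if value != value.strip() or value.strip().upper() not in ("TRUE", "FALSE"):
                print(f"line {line_no}: suspicious value {value!r}")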

Losing data in random fields when importing from a file into a table using phpMyAdmin

I have an Access DB. I exported the tables to .xlsx, then saved them as .ods using OpenOffice, because I found out that phpMyAdmin/MySQL no longer supports Excel files. I have my MySQL database formatted exactly as it should be to accept the data. I import, and everything seems fine except one little detail.
In some fields, the value is NULL instead of the value it should have according to the .ods file. Some rows show the correct value for that field; some show NULL.
Also, the "faulty" rows have some fields that show the value 0 where the imported file was empty (instead of NULL). The default value for those fields in MySQL is NULL. Each row has many fields like that, all of the same data type (tinyint). Some appear correctly as NULL and some have the value 0.
I can't see a pattern on all these.
Any help is appreciated.
Check that imported strings have double quotes ("") around them and that NULL values do not, and that all fields are separated appropriately, usually by a "," comma, with the record/row delimited by a ";" semicolon. The best way to check what MySQL is looking for is to export some existing data in the same format and compare it against what you are trying to import; one little missed quote and the deal is off. Be consistent in the use of either double (") or single (') quotes; the backtick (`) character is not used for quoting values, as far as I know. If you are pushing your data through an application that applies "smart quotes", as MS Word or OpenOffice can, this too can cause issues. Finally, put the word NULL, without quotes, in your CSV import wherever the value should be NULL.
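To make that concrete, here is a sketch (the rows and file name are invented) that writes strings in double quotes while leaving NULL and numbers unquoted, which is the shape a MySQL export of the same table would have:

    rows = [
        ("alpha", 1, None),  # None marks values that should import as SQL NULL
        ("beta", None, 0),
    ]

    def field(value):
        # Quote strings (doubling any embedded quotes), leave numbers bare,
        # and emit NULL without quotes so MySQL reads it as SQL NULL rather
        # than the four-character string "NULL".
        if value is None:
            return "NULL"
        if isinstance(value, str):
            return '"' + value.replace('"', '""') + '"'
        return str(value)

    with open("import_me.csv", "w") as f:  # placeholder file name
        for row in rows:
            f.write(",".join(field(v) for v in row) + "\n")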