I'd like to ask a question that's been bothering me for some days now.
I have a MySQL database with one table consisting of 38 columns.
Two of those columns are expected to hold Greek text (FIRSTNAME and LASTNAME).
The table and each column use the utf8_general_ci collation.
Because the table will be filled with thousands of rows every month, I have chosen to populate it by importing a CSV file created in Excel (a comma-delimited CSV).
The problem is that the import won't accept the Greek text for those 2 columns, returning the following error:
1366 - Incorrect string value: '\xCC\xC1\xCC\xC1\xD3' for column 'lastname' at row 1
Does anyone have any suggestion on what to do?
In phpMyAdmin, when you choose a table and then Import, you will see an option for the file's character set ("Character set of the file"; in the Greek version of phpMyAdmin it is "Σύνολο χαρακτήρων του αρχείου"). You must select utf-8 there.
If that doesn't work, open the CSV file and check whether the Greek characters themselves are intact; if they are not, check the exporting method, since Excel often saves CSV files in a legacy Windows code page rather than UTF-8.
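If the import ends up being done from the MySQL command line rather than phpMyAdmin, the equivalent setting is the CHARACTER SET clause of LOAD DATA. A minimal sketch, with the file path and table name as placeholders and assuming the CSV really is saved as UTF-8:

    -- Sketch only: file path and table name are placeholders.
    LOAD DATA LOCAL INFILE '/path/to/people.csv'
    INTO TABLE people
    CHARACTER SET utf8mb4         -- must match the encoding the CSV was actually saved in
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\r\n'    -- Excel-generated CSVs usually use Windows line endings
    IGNORE 1 LINES;               -- skip the header row, if there is one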
I have created a table from a CSV file using the built-in MySQL Workbench import wizard.
The structure was created and 3 test rows were imported into the table.
Now I want to use the very same CSV that I used to create the table in the first place and load the same 3 rows again, but this time using the LOAD DATA INFILE command.
I am getting error code 1300. The string field contains German ü characters, which I assume are the problem, but why? The column already contains string values with those characters.
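In case it helps anyone hitting the same thing: error 1300 usually means the character set LOAD DATA assumes for the file does not match the file's real encoding, so it is worth spelling it out explicitly. A sketch, with the table name, file path and delimiters as assumptions:

    -- Sketch only: table name and file path are placeholders.
    LOAD DATA LOCAL INFILE '/path/to/test_rows.csv'
    INTO TABLE my_table
    CHARACTER SET utf8mb4         -- or latin1, if the file is actually in a Windows code page
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES;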
I have a CSV file that I exported myself out of SFDC. It has about 60k records. 3 columns are numbers, 1 is a date, and the other dozen or so are text.
In SSMS or SSIS, when I attempt to import the file into a table, the importer errors out on the same row of data each time (row 15421) with the error message:
The data conversion for column XXXX returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page."
The error points to one of the text columns. When I look at the data that did import into the table, it ends exactly at that row and column - the column is empty. The contents of that column are just 2 characters.
My first attempt was to use DT_STR (255), which resulted in the error. If I switch to DT_WSTR (1024) or even DT_NTEXT, the job runs and reports success, but it still ends at exactly that row and doesn't import the rest of the 45k records - as if something in that row (and in that column?) indicates that the file is finished at that point.
I looked at the file with Notepad++, the Sublime Text Gremlins plugin, and the Sublime Text hex editor - I can't see anything abnormal in the data, the text-qualifying quotation marks, or the comma delimiters... Thoughts? TIA!
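Purely as a way to test whether the file itself or the SSIS type-guessing is at fault, the same load can be tried with a plain BULK INSERT. A sketch with an assumed staging table and path (CODEPAGE '65001' requires SQL Server 2016 or later, FIELDQUOTE requires 2017 or later):

    BULK INSERT dbo.SfdcStaging           -- placeholder staging table
    FROM 'C:\exports\sfdc_export.csv'     -- placeholder path
    WITH (
        CODEPAGE        = '65001',        -- read the file as UTF-8
        FIELDTERMINATOR = ',',
        FIELDQUOTE      = '"',            -- honour the text-qualifying quotes
        ROWTERMINATOR   = '0x0a',
        FIRSTROW        = 2               -- skip the header row
    );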
I created a table with multiple columns, including 5 phone number columns, all of which have the data type
varchar(15)
But when I insert a 10-digit phone number without - or (), such as 9999999999, only the phonenumber3, phonenumber4 and phonenumber5 columns get stored as 9999999999.0.
I'm reading from CSV and writing into table using pandas to_sql().
Why is this happening?
If the CSV data is generated by the server and left unaltered by the user, the "zero" data is preserved during import, assuming the values are wrapped in " quotes as a container. Everything should be fine.
If the CSV is altered by the user, e.g. edited and re-saved, the "zero" data will be trimmed, because editors such as Excel or LibreOffice drop the leading zeros when saving. The workaround is to edit the CSV file in a plain text editor (e.g. Sublime Text); there the columns stay preserved as "012345678", with the quotes acting as the container.
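Separately, for rows that already landed with the stray .0 suffix described in the question, a one-off cleanup of the varchar columns could help. A sketch using the phone column names from the question and an assumed table name:

    UPDATE customer_contacts              -- placeholder table name
    SET    phonenumber3 = TRIM(TRAILING '.0' FROM phonenumber3),
           phonenumber4 = TRIM(TRAILING '.0' FROM phonenumber4),
           phonenumber5 = TRIM(TRAILING '.0' FROM phonenumber5)
    WHERE  phonenumber3 LIKE '%.0'
       OR  phonenumber4 LIKE '%.0'
       OR  phonenumber5 LIKE '%.0';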
OK, so I am trying to import data from a CSV file into a MySQL database table. The table is called serial_code, but when I try to upload the CSV file I get an error message. I have tried taking the column names out, adding NULL to the last column EngineSerialCode, and viewing the CSV in a text editor, where it shows the columns correctly separated by commas.
Invalid column count in CSV input on line 1.
Screenshots: the phpMyAdmin and MySQL versions, the database fields, and the CSV fields I want to import.
OK, so I found the issue, and it was quite frustrating. One serial code in my data contained a "," comma instead of a "-". I also noticed that in my CSV file EngineSerialCode was placed in column E and not D, even though I only have 4 columns in my database; in the image, column C is overlapping column D, hence my mistake.
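For anyone hitting the same "Invalid column count" error because a value contains a comma: quoting the fields in the CSV and declaring the enclosure character on import avoids the miscount. A minimal LOAD DATA sketch, with the file path assumed and the column list omitted (the CSV columns are assumed to be in the same order as the serial_code table):

    LOAD DATA LOCAL INFILE '/path/to/serial_codes.csv'   -- placeholder path
    INTO TABLE serial_code
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'    -- commas inside "..." no longer split the row
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES;               -- skip the header row, if the CSV keeps one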
I am trying to import a text file into MySQL through the Navicat DB software.
I am struggling to import (append) the text file into a MySQL table.
The text file's fields are separated by |;
example: |Name|Email|Address|
When I import this through the Navicat import wizard, it asks which delimiter separates the fields. So instead of selecting Tab, ;, or any other option, I select | as the field separator.
But the fields in the file still do not match (sync) the fields of the table...
Can anyone suggest any advice here?
I actually exported the text file from another MySQL DB through the export functionality of phpMyAdmin.
I assume your name column is null and the values appear instead in the email column?
I suspect the problem lies in the fact that your fields are not only separated by a pipe, your rows also begin and end with a pipe.
Think of a CSV as name,email,address rather than ,name,email,address,; the latter would be interpreted as 5 columns, with the first and last fields being null.
You'll have to choose a different delimiter for your rows and fields.
Beyond that, you can try importing the data into a new table and then write an insert query to map the temp fields to the ones in your database; a sketch of that is below. The screen after the one where you choose the target table has a mapping table where you can match your import fields to the target ones.
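A rough sketch of that staging approach with LOAD DATA, assuming the pipe-delimited layout from the question and placeholder table/column names (the leading and trailing pipes become empty first and last fields, which is why the staging table needs two throwaway columns):

    CREATE TABLE contacts_staging (
        lead_blank  VARCHAR(255),   -- empty field produced by the leading |
        name        VARCHAR(255),
        email       VARCHAR(255),
        address     VARCHAR(255),
        trail_blank VARCHAR(255)    -- empty field produced by the trailing |
    );

    LOAD DATA LOCAL INFILE '/path/to/contacts.txt'   -- placeholder path
    INTO TABLE contacts_staging
    FIELDS TERMINATED BY '|'
    LINES TERMINATED BY '\n';

    -- Map only the real fields into the target table (placeholder name).
    INSERT INTO contacts (name, email, address)
    SELECT name, email, address
    FROM contacts_staging;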
Let me know how that works out.