MySQL LOAD DATA timestamp format

I am trying to load data into a MySQL table, and all of my timestamps (DATETIME column type) are being loaded as NULL. An example value I am trying to load is "2002-04-26 19:31:15.200000000", and I have specified the format '%Y-%c-%d %H:%i:%s.%f' (statement below).
Since NULL is being loaded, my assumption is that I have the format wrong and that is what causes the NULL. What is the correct format that I should be specifying?
SET DateAdded = STR_TO_DATE(@DateAdded,'%Y-%c-%d %H:%i:%s.%f')
The complete LOAD DATA INFILE statement:
LOAD DATA INFILE 'CANDIDATES.txt' INTO TABLE dbo.CANDIDATE
FIELDS TERMINATED BY '|'
IGNORE 1 LINES
(ResumeKey,Address1,Address2,City,Candidate_Type,Candidate_Type_Changed_UserID,State,Country,@DateAdded,@DateModified,Degree,Email,FirstName,GPA,GradYear,HomePhone,JobTitle,LastName,Locale,Major,MiddleName,OrgName,OtherPhone,SchoolName,Zip,Active_Flag,SecureCandidate_Flag,CandidateStackingField,CellPhone,Homepage,FaxNumber,BRUID)
SET
DateAdded = STR_TO_DATE(@DateAdded,'%Y-%c-%d %H:%i:%s.%f'),
DateModified = STR_TO_DATE(@DateModified,'%Y-%c-%e %H:%i:%s.%f');

The issue was that the file being loaded was not in UTF-8. MySQL's LOAD DATA cannot read UTF-16 files.
To convert the file on Linux: iconv -f utf-16le -t utf-8 <utf-16le file> > <new utf-8 file>
If you need to find the current encoding of a file, use file -ib filename.
After converting the file it loaded without issue.

Related

In MySQL, how to upload a CSV file that contains a date in the format of '1/1/2020' properly into a DATE data type column (standard YYYY-MM-DD)

I have a column of data, let's call it bank_date, that I receive from an external vendor as a csv file every day. As such the dates in that column show as '1/1/2020'.
I am trying to upload that raw CSV file directly to SQL daily. We used to store bank_date as text, but we have converted it to a DATE data type, and now it keeps zeroing out every time, with some sort of truncate / "datetime value incorrect" error.
I have now tested 17 different versions of utilizing STR_TO_date (mostly), CAST, and CONVERT, and feel like I'm close, but I'm not quite getting the syntax right.
For reference, I did find 2 other workarounds that are successful, but my boss specifically wants it uploaded and converted directly through the import process (not manipulating the raw CSV data) for safety reasons:
Workaround 1: Convert the CSV date column to the YYYY-MM-DD format and save the file. The issue with this is that if you open that CSV file again, Excel auto-changes the date format back to the standard mm/dd/yyyy. If someone doesn't know to watch out for this and re-opens the CSV to double-check something, they're going to hit an error on upload, and the problem is not easy to identify.
Workaround 2: Create an extra dummy_date column in the table with a text data type and upload as normal. Then copy the data into the correct bank_date column using STR_TO_DATE, as follows: UPDATE bank_table SET bank_date = STR_TO_DATE(dummy_date, '%c/%e/%Y'); The issue with this is that it creates extra unnecessary data that can confuse other people who may not know that one of the columns is not intended for querying.
Here is my current code:
USE database_name;
LOAD DATA LOCAL INFILE 'C:/Users/Shelly/Desktop/Date Import.csv'
INTO TABLE bank_table
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 ROWS
(bank_date, bank_amount)
SET bank_date = str_to_date(bank_date,'%Y-%m-%d');
The "SET" line is the syntax I cannot work out: converting a CSV's '1/5/2020' to SQL's 2020-01-05 format. Every test I've made either produces 0000-00-00 or NULLs the column cells. I'm thinking maybe I need to tell SQL how to understand the CSV's format so that it knows how to convert it. Newbie here and stuck.
You need to specify the format of the date as it appears in the file, not the "required" target format:
SET bank_date = str_to_date(bank_date,'%c/%e/%Y');
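MySQL's %c and %e tokens accept single-digit months and days. As a sanity check outside MySQL, the same conversion can be mimicked in Python, whose strptime is equally tolerant of missing leading zeros (this is an illustration, not part of the LOAD DATA statement):

```python
from datetime import datetime

def to_mysql_date(us_date: str) -> str:
    # Parse a US-style m/d/YYYY value (leading zeros not required)
    # and emit the YYYY-MM-DD form a DATE column expects.
    return datetime.strptime(us_date, "%m/%d/%Y").strftime("%Y-%m-%d")

print(to_mysql_date("1/5/2020"))   # -> 2020-01-05
```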

MySql naming the automatically downloaded CSV file using first and last date

My MySQL query gives me data from 2020-09-21 to 2022-11-02. I want to save the file as FieldData_20200921_20221102.csv.
The MySQL query:
SELECT 'datetime','sensor_1','sensor_2'
UNION ALL
SELECT datetime,sensor_1,sensor_2
FROM `field_schema`.`sensor_table`
INTO OUTFILE "FieldData.csv"
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
;
Present output file:
Presently I name the file FieldData.csv, and it is accordingly created with that name. But I want the query to automatically append the first and last dates to this filename, so it tells me the date range of the data without my having to open the file.
Expected output file
FieldData_20200921_20221102.csv.
MySQL's SELECT ... INTO OUTFILE syntax accepts only a fixed string literal for the filename, not a variable or an expression.
To make a custom filename, you would have to format the filename yourself and then write dynamic SQL so the filename could be a string literal. But to do that, you first would have to know the minimum and maximum date values in the data set you are dumping.
I hardly ever use SELECT ... INTO OUTFILE, because it can only create the outfile on the database server. I usually want the file to be saved on the server where my client application is running, and the database server's filesystem is not accessible to the application.
Both the file naming problem and the filesystem access problem are better solved by avoiding the SELECT ... INTO OUTFILE feature, and instead writing to a CSV file using application code. Then you can name the file whatever you want.

Change the date format in a csv .txt file before importing using mysqlbulkloader

For anybody looking for an answer in the future, the best way to do this is the LOAD DATA LOCAL INFILE option, where you call out your particular columns and designate the columns you want to preprocess with an @ symbol, like the example below. You may also need to update both your client-side and server-side settings to allow the LOAD DATA LOCAL option to work.
In my code example below I wanted to change the CreateDate field from the m/d/Y format to the MySQL format of YYYY-MM-DD.
LOAD DATA local INFILE 'C:/Users/username/Documents/SAP/SAP GUI/QNs.txt'
IGNORE
INTO TABLE tbl_quality_notification
FIELDS TERMINATED BY '|'
LINES STARTING BY '|'
TERMINATED BY '\r\n'
IGNORE 6 ROWS
(@CreateDate,
NotificationNumber,
ItemNumber,
WorkOrder,
MaterialNumber,
MaterialDescription,
Disposition,
DeptResponsible,
OperationCaused,
DefQtyInt,
DefQtyExt,
TotalQty,
WcOpFoundAt,
OpDescription,
DefectLocation,
ProblemType,
ProblemDescription,
CauseCode,
CauseDescription,
RootCause,
CorrectiveAction)
SET CreateDate = STR_TO_DATE(@CreateDate, '%m/%d/%Y');
Thanks GMB, LOAD DATA LOCAL INFILE seems to be the right solution. I will post a new thread to address the invalid character string issues.

date field values are wrong while importing csv into mysql

I am importing a CSV file into MySQL using LOAD DATA. My LOAD DATA command is as mentioned below.
load data local infile 'D:/mydata.csv' into table mydb.mydata
fields terminated by ','
enclosed by '"'
lines terminated by '\r\n'
ignore 1 lines
(SrNo,SourceFrom,@var_updated,Title,First_Name,Middle_Name,Last_Name,Designation,Company_Name,@var_dob,Office_Mobile_No)
set updated = str_to_date(@var_updated,'%Y-%m-%d'), dob = str_to_date(@var_dob, '%Y-%m-%d');
I am getting different values in my "updated" and "dob" columns; those values differ from what is in my .csv file.
The first image is from MySQL Workbench while the other is from the CSV.
Also, I set the "office_mobile_no" column's format to 'number' in the CSV, but it still displays the number in Excel's shortened form for large numbers.
Only when I double-click on it does it show the real number, like 9875461234. It imports the same into MySQL too. How do I get the original number in a specific column? And why do my imported date values differ from the CSV's date columns?
A couple of points that I can see:
It looks from your screenshot like the data in your CSV file for "updated" is in d-m-Y format, but you're telling the import to look for Y-m-d. I think you need to change
set updated = str_to_date(@var_updated,'%Y-%m-%d')
to
set updated = str_to_date(@var_updated,'%d-%m-%Y')
And the same for DOB field as well, assuming your CSV has that in the same format.
You said "I set the office_mobile_no column's format to 'number' in the CSV". CSV is a text file format; it doesn't store any information about how to display data. What you're seeing is just how Excel decides to display large numbers by default. You can change that, but your changes won't be saved when you save to CSV, because the CSV format doesn't include that sort of information. Try opening the file in Notepad++ to see the real contents of the file.

nodejs- mysql invalid utf8 characer string load data in file

I am writing the data returned from PayPal into a CSV file and then loading this CSV file into a MySQL database.
Part of the data returned from PayPal: %f0%9f%98%9d%2e
which I then decode using decodeURIComponent to 😝.
When I save this to the file and do LOAD DATA INFILE, it gives a MySQL error:
Invalid utf8 character string
The LOAD DATA INFILE query is:
LOAD DATA LOCAL INFILE 'note.csv' REPLACE INTO TABLE table1 CHARACTER SET UTF8 FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n' (note)
The collation of the database, table, and columns is utf8_general_ci.
So I tried encoding the data returned from PayPal using
utf8.encode("😝.")
and then saved it to the file. But it was not encoded correctly.
After this, LOAD DATA INFILE didn't return any error, but the data saved in the column was ð., which is incorrect.
How do I correctly encode the string so that it can be loaded into the table?
Link to files:
http://speedy.sh/RMu5M/note.csv (encoded)
http://speedy.sh/ajrQj/notenonencoded.csv (nonencoded)
For 😝 (hex f09f989d), you need MySQL's utf8mb4 character set, not utf8. Use utf8mb4 for the table and for the connection.
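The byte count is easy to verify in Python, which shows why the three-bytes-per-character utf8 (utf8mb3) encoding rejects the value:

```python
s = "\U0001F61D"  # 😝 U+1F61D FACE WITH STUCK-OUT TONGUE AND WINKING EYE
b = s.encode("utf-8")
print(b.hex())  # -> f09f989d
print(len(b))   # -> 4; utf8mb3 columns store at most 3 bytes per character, utf8mb4 allows 4
```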