MariaDB CSV output limitations

I have some data in a MariaDB 10.2 database that I'm trying to do the following with:
1. Compose a JSON document using nested JSON_SET calls.
2. Export the query results into a CSV file for use later on in another tool.
These JSON documents vary in size depending on the field values, but in general they're in the range of 250-400 characters long. The issue I am running into is that the JSON documents are getting truncated to 278 characters, often resulting in malformed JSON that cannot be used.
Is there a configuration option or query parameter that I can use to control this? I tried Googling for it earlier, but so far I've been unable to find anything.
Would appreciate any help!
As an example, the query looks like:
SELECT field_1, JSON_SET(JSON_SET('{"foo":{}}', '$.foo.bar', field_2), '$.foo.baz', field_3)
FROM test
INTO OUTFILE '/tmp/test.csv'
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\n';
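One way to narrow this down is to check whether the truncation happens while the JSON is being built or only when it is written out, for example by computing the length server-side first (a diagnostic sketch reusing the query above):

-- If json_len already tops out at 278, the truncation happens in the
-- JSON construction itself rather than in INTO OUTFILE.
SELECT field_1,
       CHAR_LENGTH(JSON_SET(JSON_SET('{"foo":{}}', '$.foo.bar', field_2),
                            '$.foo.baz', field_3)) AS json_len
FROM test;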

Related

In MySQL, how to upload a csv file that contains a date in the format of '1/1/2020' properly into a DATE data type format (standard YYYY-MM-DD)

I have a column of data, let's call it bank_date, that I receive from an external vendor as a csv file every day. As such, the dates in that column show as '1/1/2020'.
I am trying to upload that raw csv file directly to SQL daily. We used to store bank_date as text, but we have converted it to a DATE data type, and now it keeps zeroing out every time, with some sort of truncation / "datetime value incorrect" error.
I have now tested 17 different versions using STR_TO_DATE (mostly), CAST, and CONVERT, and feel like I'm close, but I'm not quite getting the syntax right.
Also for reference, I did find 2 other workarounds that are successful, but my boss specifically wants it uploaded and converted directly through the import process (not by manipulating the raw csv data) for safety reasons:
Workaround 1: Convert the csv date column to the YYYY-MM-DD format and save the file. The issue with this is that if you re-open that CSV file (in Excel), the date format auto-changes back to the standard mm/dd/yyyy. If someone doesn't know to watch out for this and re-opens the csv file to double-check something, they're going to get an error when they upload, and the problem is not easy to identify.
Workaround 2: Create an extra dummy_date column in the table that is a text data type and upload as normal. Then copy the data into the correct bank_date column using STR_TO_DATE as follows: UPDATE bank_table SET bank_date = STR_TO_DATE(dummy_date, '%c/%e/%Y'); The issue with this is that it creates extra unnecessary data, and other people may not know that one of the columns is not intended for querying.
Here is my current code:
USE database_name;
LOAD DATA LOCAL INFILE 'C:/Users/Shelly/Desktop/Date Import.csv'
INTO TABLE bank_table
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 ROWS
(bank_date, bank_amount)
SET bank_date = str_to_date(bank_date,'%Y-%m-%d');
The "SET" line is what I cannot work out on syntax to convert a csv's 1/5/2020' to SQL's 2020-1-5 format. Every test I've made either produces 0000-00-00 or nulls the column cells. I'm thinking maybe I need to tell SQL how to understand the csv's format in order for it to know how to convert it. Newbie here and stuck.
You need to specify the format of the date as it appears in the file, not the format you want it stored in:
SET bank_date = str_to_date(bank_date,'%c/%e/%Y');
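Putting it together, the conventional idiom is to read the raw field into a user variable and convert it in the SET clause (a sketch using the same file and table as above; @bank_date is just an illustrative variable name):

LOAD DATA LOCAL INFILE 'C:/Users/Shelly/Desktop/Date Import.csv'
INTO TABLE bank_table
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 ROWS
-- read the raw text into a user variable instead of the column
(@bank_date, bank_amount)
-- then convert it, using the format the file actually contains
SET bank_date = STR_TO_DATE(@bank_date, '%c/%e/%Y');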

MySql naming the automatically downloaded CSV file using first and last date

My MySQL query gives me data from 2020-09-21 to 2022-11-02. I want to save the file as FieldData_20200921_20221102.csv.
MySQL query:
SELECT 'datetime','sensor_1','sensor_2'
UNION ALL
SELECT datetime,sensor_1,sensor_2
FROM `field_schema`.`sensor_table`
INTO OUTFILE "FieldData.csv"
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
;
Present output file:
Presently I name the file FieldData.csv, and that is exactly what I get. But I want the query to automatically append the first and last dates to the filename, so I can tell the date range of the data without having to open the file.
Expected output file
FieldData_20200921_20221102.csv.
MySQL's SELECT ... INTO OUTFILE syntax accepts only a fixed string literal for the filename, not a variable or an expression.
To make a custom filename, you would have to format the filename yourself and then write dynamic SQL so the filename could be a string literal. But to do that, you first would have to know the minimum and maximum date values in the data set you are dumping.
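Here is a minimal sketch of that approach, assuming the table and column names from the question: first query the date range into user variables, then build the statement as a string and run it as a prepared statement:

SELECT DATE_FORMAT(MIN(datetime), '%Y%m%d'),
       DATE_FORMAT(MAX(datetime), '%Y%m%d')
INTO @first, @last
FROM `field_schema`.`sensor_table`;

-- Build the export statement with the dates embedded in the filename.
SET @sql = CONCAT(
  "SELECT 'datetime','sensor_1','sensor_2' UNION ALL ",
  "SELECT datetime, sensor_1, sensor_2 FROM `field_schema`.`sensor_table` ",
  "INTO OUTFILE 'FieldData_", @first, "_", @last, ".csv' ",
  "FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\\n'");

PREPARE stmt FROM @sql;
EXECUTE stmt;
DEALLOCATE PREPARE stmt;

Note that INTO OUTFILE still refuses to overwrite an existing file, so the generated name must not already exist on the server.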
I hardly ever use SELECT ... INTO OUTFILE, because it can only create the outfile on the database server. I usually want the file to be saved on the server where my client application is running, and the database server's filesystem is not accessible to the application.
Both the file naming problem and the filesystem access problem are better solved by avoiding the SELECT ... INTO OUTFILE feature, and instead writing to a CSV file using application code. Then you can name the file whatever you want.

Trouble loading data from txt file into MySQL with LOAD DATA INFILE

I am having a hard time loading my data into MySQL from a text file. I have been trying to choose the correct delimiters, but my file repeats the column names alongside each value.
The data is structured like this:
{"id":"15","name":"greg","age":"32"}
{"id":"16","name":"jim","age":"42"}
The SQL statement I am working on currently looks something like this:
LOAD DATA LOCAL INFILE '/xxx.txt' INTO TABLE t1 FIELDS
TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY '\r\n'(id, name, age);
The results are being stored like this:
{"id":"16", "name","greg"}
I need to do away with the column names and store only the values.
Any tips?
Writing a script, as suggested by @Shadow, will be easier to work with, but you can check the section on JSON on this page: how to import json-text-xml and csv data into mysql
or
Import JSON to MySQL made easy with the MySQL Shell, if you are using MySQL Shell 8.0.13 (GA).
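If you'd rather do it directly in LOAD DATA, here is a minimal sketch, assuming MySQL 5.7+ (for the JSON functions) and the t1(id, name, age) table from the question: read each whole line into a user variable and extract the values in the SET clause:

-- Each line is read as a single field (the default tab delimiter does
-- not occur in the data), then the values are pulled out of the JSON.
LOAD DATA LOCAL INFILE '/xxx.txt' INTO TABLE t1
LINES TERMINATED BY '\r\n'
(@json)
SET id   = JSON_UNQUOTE(JSON_EXTRACT(@json, '$.id')),
    name = JSON_UNQUOTE(JSON_EXTRACT(@json, '$.name')),
    age  = JSON_UNQUOTE(JSON_EXTRACT(@json, '$.age'));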

LOAD DATA FROM S3 command failing because of timestamp

I'm running the "LOAD DATA FROM S3" command to load a CSV file from S3 into Aurora MySQL. The command works fine if I run it in MySQL Workbench (it reports the exception below as a warning, but still inserts the dates fine), but when I run it in Java I get the following exception:
com.mysql.cj.jdbc.exceptions.MysqlDataTruncation:
Data truncation: Incorrect datetime value: '2018-05-16T00:31:14-07:00'
Is there a workaround? Is there something I need to set up on the MySQL side or in my app to make this transformation seamless? Should I somehow run a REPLACE() command on the timestamp?
Update 1:
When I use REPLACE to remove the "-07:00" from the original timestamp (2018-05-16T00:31:14-07:00), it loads the data appropriately. Here's my load statement:
LOAD DATA FROM S3 's3://bucket/object.csv'
REPLACE
INTO TABLE sample
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(@myDate)
SET `created-date` = REPLACE(@myDate, '-07:00', ' ');
For obvious reasons this is not a good solution. Why would the LOAD statement work in MySQL Workbench and not in my Java code? Can I set some parameter to make it work? Any help is appreciated!
The way I solved it was by using MySQL's SUBSTRING function in the SET part of the LOAD DATA query (instead of REPLACE):
SUBSTRING(@myDate, 1, 10)
This way the trailing '-07:00' is removed. (I actually opted to remove the time as well, since I didn't need it, but you can keep it for TIMESTAMP columns by taking a longer substring.)
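For completeness, a sketch of the full statement with that fix applied (same table and file as in the question):

LOAD DATA FROM S3 's3://bucket/object.csv'
REPLACE
INTO TABLE sample
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(@myDate)
-- keep only the date part; SUBSTRING(@myDate, 1, 19) would keep the
-- time of day while still dropping the '-07:00' offset
SET `created-date` = SUBSTRING(@myDate, 1, 10);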

How can I load 10,000 rows of test.xls file into mysql db table?

How can I load 10,000 rows of test.xls file into mysql db table?
When I use the query below, it shows an error.
LOAD DATA INFILE 'd:/test.xls' INTO TABLE karmaasolutions.tbl_candidatedetail (candidate_firstname,candidate_lastname);
My primary key is candidateid and has the properties below.
The test.xls contains data like the sample below.
I have added rows starting from candidateid 61 because up to 60 there are already candidates in the table.
Please suggest solutions.
Export your Excel spreadsheet to CSV format.
Then import the CSV file into MySQL using a command similar to the one you are currently trying:
LOAD DATA INFILE 'd:/test.csv'
INTO TABLE karmaasolutions.tbl_candidatedetail
(candidate_firstname,candidate_lastname);
Importing data from Excel (or any other program that can produce a text file) is very simple using the LOAD DATA command from the MySQL command prompt.

1. Save your Excel data as a csv file (in Excel 2007, use Save As).
2. Check the saved file in a text editor such as Notepad to see what it actually looks like, i.e. what delimiter was used etc.
3. Start the MySQL Command Prompt (I'm lazy, so I usually do this from the MySQL Query Browser via Tools > MySQL Command Line Client to avoid having to enter the username and password).
4. Enter this command:

LOAD DATA LOCAL INFILE 'C:/temp/yourfile.csv'
INTO TABLE database.table
FIELDS TERMINATED BY ';'
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
(field1, field2);

Done! Very quick and simple once you know it :)
Some notes from my own import (they may not apply to you if you run a different language version, MySQL version, Excel version etc.):

TERMINATED BY: this is why I included step 2. I thought a csv would default to comma-separated, but at least in my case semicolon was the default.
ENCLOSED BY: my data was not enclosed by anything, so I left this as an empty string ''.
LINES TERMINATED BY: at first I tried with only '\n' but had to add the '\r' to get rid of a carriage return character being imported into the database.
Also make sure that if you do not import into the primary key field/column, it has auto increment on; otherwise only the first row will be imported.
Original Author reference