MySQL: naming the automatically downloaded CSV file using the first and last date

My MySQL query returns data from 2020-09-21 to 2022-11-02, and I want to save the file as FieldData_20200921_20221102.csv.
MySQL query:
SELECT 'datetime','sensor_1','sensor_2'
UNION ALL
SELECT datetime,sensor_1,sensor_2
FROM `field_schema`.`sensor_table`
INTO OUTFILE "FieldData.csv"
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
;
Present output file:
At present I name the file FieldData.csv, and the query accordingly produces that file. But I want the query to automatically append the first and last dates to the filename, so I can tell the duration of the data without having to open it.
Expected output file
FieldData_20200921_20221102.csv

MySQL's SELECT ... INTO OUTFILE syntax accepts only a fixed string literal for the filename, not a variable or an expression.
To make a custom filename, you would have to format the filename yourself and then write dynamic SQL so the filename could be a string literal. But to do that, you first would have to know the minimum and maximum date values in the data set you are dumping.
I hardly ever use SELECT ... INTO OUTFILE, because it can only create the outfile on the database server. I usually want the file to be saved on the server where my client application is running, and the database server's filesystem is not accessible to the application.
Both the file naming problem and the filesystem access problem are better solved by avoiding the SELECT ... INTO OUTFILE feature, and instead writing to a CSV file using application code. Then you can name the file whatever you want.
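That application-side approach can be sketched in a few lines of Python. A minimal sketch, assuming a DB-API driver such as mysql.connector is available and using the table and column names from the question; the date-range query runs first so the filename can be built before the rows are streamed out:

```python
import csv
from datetime import date

def dated_filename(base, first, last):
    """Build e.g. FieldData_20200921_20221102.csv from two dates."""
    return f"{base}_{first:%Y%m%d}_{last:%Y%m%d}.csv"

def export_csv(conn, base="FieldData"):
    cur = conn.cursor()
    # Find the date range of the data set being dumped.
    cur.execute("SELECT MIN(datetime), MAX(datetime) "
                "FROM field_schema.sensor_table")
    first, last = cur.fetchone()
    outname = dated_filename(base, first, last)
    # Stream the rows into a client-side CSV file, header row first.
    cur.execute("SELECT datetime, sensor_1, sensor_2 "
                "FROM field_schema.sensor_table")
    with open(outname, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["datetime", "sensor_1", "sensor_2"])
        writer.writerows(cur)
    return outname
```

Unlike INTO OUTFILE, this writes the file on the client machine, which also sidesteps the filesystem-access problem mentioned above.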

Related

Change the date format in a csv .txt file before importing using mysqlbulkloader

For anybody looking for an answer in the future, the best way to do this is the LOAD DATA LOCAL INFILE option: list your columns explicitly and read each column you want to preprocess into a user variable (prefixed with @), as in the example below. You may also need to update both your client-side and server-side settings to allow the LOAD DATA LOCAL option to work.
In the code example below I wanted to convert the CreateDate field from the m/d/Y format to the MySQL format of YYYY-MM-DD.
LOAD DATA LOCAL INFILE 'C:/Users/username/Documents/SAP/SAP GUI/QNs.txt'
IGNORE
INTO TABLE tbl_quality_notification
FIELDS TERMINATED BY '|'
LINES STARTING BY '|'
TERMINATED BY '\r\n'
IGNORE 6 ROWS
(@CreateDate,
NotificationNumber,
ItemNumber,
WorkOrder,
MaterialNumber,
MaterialDescription,
Disposition,
DeptResponsible,
OperationCaused,
DefQtyInt,
DefQtyExt,
TotalQty,
WcOpFoundAt,
OpDescription,
DefectLocation,
ProblemType,
ProblemDescription,
CauseCode,
CauseDescription,
RootCause,
CorrectiveAction)
SET CreateDate = STR_TO_DATE(@CreateDate, '%m/%d/%Y');
Thanks GMB, LOAD DATA LOCAL INFILE seems to be the right solution. I will post a new thread to address the invalid character string issues.

Trouble loading data from txt file into MySQL with LOAD DATA INFILE

I am having a hard time loading my data into MySQL from a text file. I have been trying to choose the correct delimiters, but my file repeats the column name with each value.
The data is structured like this
{"id":"15","name":"greg","age":"32"}
{"id":"16","name":"jim","age":"42"}
the sql statement I am working on looks something like this currently
LOAD DATA LOCAL INFILE '/xxx.txt' INTO TABLE t1
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
(id, name, age);
results are being stored like this
{"id":"16", "name","greg"}
I need to do away with the column names and store only the values.
Any tips?
Writing a script, as suggested by @Shadow, will be easier to work with, but you can also check the JSON section of the page "How to import JSON, text, XML, and CSV data into MySQL",
or
"Import JSON to MySQL made easy with the MySQL Shell" if you are using MySQL Shell 8.0.13 (GA) or later.
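The script route suggested above is straightforward, since each line of the file is already a valid JSON object. A minimal sketch, assuming a MySQLdb-style DB-API driver and the table and columns from the question:

```python
import json

def parse_lines(lines):
    """Turn lines like {"id":"15","name":"greg","age":"32"} into row tuples."""
    rows = []
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank lines
        rec = json.loads(line)
        rows.append((rec["id"], rec["name"], rec["age"]))
    return rows

def load_file(conn, path):
    # Parse the JSON-per-line file, then bulk-insert only the values.
    with open(path) as f:
        rows = parse_lines(f)
    cur = conn.cursor()
    cur.executemany("INSERT INTO t1 (id, name, age) VALUES (%s, %s, %s)",
                    rows)
    conn.commit()
```

Because the values are parsed as JSON rather than split on commas, the embedded quotes and column names never reach the table.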

MariaDB CSV output limitations

I have some data in a MariaDB 10.2 database that I'm trying to do the following to:
Compose a JSON document using nested JSON_SET calls
Export a list of query results into a CSV file for use later on in another tool.
These JSON documents can vary in size depending on the field values, but in general they are in the range of 250-400 characters long. The issue I am running into is that the JSON documents are getting truncated to 278 characters, often resulting in malformed JSON that cannot be used.
Is there a configuration or query parameter that I can use to configure this? I tried Googling for it earlier, but so far I've been unable to find anything.
Would appreciate any help!
As an example, the query looks like:
SELECT field_1, JSON_SET(JSON_SET('{"foo":{}}', '$.foo.bar', field_2), '$.foo.baz', field_3)
FROM test
INTO OUTFILE '/tmp/test.csv'
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\n';

How can I load 10,000 rows of test.xls file into mysql db table?

How can I load 10,000 rows of test.xls file into mysql db table?
When I use the query below, it shows an error.
LOAD DATA INFILE 'd:/test.xls' INTO TABLE karmaasolutions.tbl_candidatedetail (candidate_firstname,candidate_lastname);
My primary key is candidateid and has below properties.
The test.xls contains data like below.
I have added rows starting from candidateid 61 because candidates up to 60 are already in the table.
Please suggest a solution.
Export your Excel spreadsheet to CSV format.
Import the CSV file into MySQL using a command similar to the one you are currently trying:
LOAD DATA INFILE 'd:/test.csv'
INTO TABLE karmaasolutions.tbl_candidatedetail
(candidate_firstname,candidate_lastname);
To import data from Excel (or any other program that can produce a text file) is very simple using the LOAD DATA command from the MySQL Command prompt.
Save your Excel data as a csv file (in Excel 2007, use Save As). Check
the saved file in a text editor such as Notepad to see what it
actually looks like, i.e. what delimiter was used etc. Start the MySQL
command prompt (I'm lazy, so I usually do this from the MySQL Query
Browser - Tools - MySQL Command Line Client, to avoid having to enter
the username and password). Then enter this command:
LOAD DATA LOCAL INFILE 'C:\temp\yourfile.csv' INTO TABLE database.table
FIELDS TERMINATED BY ';' ENCLOSED BY '"' LINES TERMINATED BY '\r\n'
(field1, field2);
[Edit: make sure to check your single quotes (') and double quotes (")
if you copy and paste this code - it seems WordPress changes them into
similar-looking but different characters.] Done! Very quick and simple
once you know it :)
Some notes from my own import - they may not apply to you if you run a different language version, MySQL version, Excel version etc.
TERMINATED BY: this is why I included step 2. I thought a csv would default to comma-separated, but at least in my case semicolon was the default.
ENCLOSED BY: my data was not enclosed by anything, so I left this as an empty string ''.
LINES TERMINATED BY: at first I tried with only '\n' but had to add the '\r' to get rid of a carriage return character being imported into the database.
Also make sure that if you do not import into the primary key field/column, it has auto-increment on; otherwise only the first row will be imported.
Original Author reference

Is there a way to insert custom data while loading data from a file in MySQL?

I am using the following statement to load data from a file into a table:
LOAD DATA LOCAL INFILE '/home/100000462733296__Stats'
INTO TABLE datapoints
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\n'
(uid1, uid2, status);
Now, if I want to enter a custom value into uid1, say 328383 without actually asking it to read it from a file, how would I do that? There are about 10 files and uid1 is the identifier for each of these files. I am looking for something like this:
LOAD DATA LOCAL INFILE '/home/100000462733296__Stats'
INTO TABLE datapoints
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\n'
(uid1="328383", uid2, status);
Any suggestions?
The SET clause can be used to supply values not derived from the input file:
LOAD DATA LOCAL INFILE '/home/100000462733296__Stats'
INTO TABLE datapoints
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\n'
(uid1, uid2, status)
SET uid1 = '328383';
It's not clear what the data type of uid1 is, but since you enclosed the value in double quotes I assumed it's a string-related data type; remove the quotes if the data type is numeric.
There's more on what the SET functionality supports in the LOAD DATA documentation, a little more than halfway down the page.
You could use an interactive Python shell instead of the MySQL shell to interactively provide values for MySQL tables.
Install the Python interpreter from python.org (only needed if you are on Windows; otherwise you already have it) and the MySQL connector from http://sourceforge.net/projects/mysql-python/files/ (ah, I see you are on Linux/Unix - just install the mysqldb package then).
After that, you type these three lines in the Python shell:
import MySQLdb
connection = MySQLdb.connect("<hostname>", "<user>", "<pwd>")  # optionally port=<port>
cursor = connection.cursor()
After that you can use the cursor.execute method to issue SQL statements, while retaining the full flexibility of Python to transform your data.
For example, for this specific query:
myfile = open("/home/100000462733296__Stats")
for line in myfile:
    uid1, uid2, status = line.rstrip("\n").split("|")
    # uid1 read from the file is ignored; the custom value is supplied instead
    cursor.execute(
        "INSERT INTO datapoints (uid1, uid2, status) VALUES (%s, %s, %s)",
        ("328383", uid2, status))
connection.commit()
Voilà!
(Maybe with a try/except around the line.split call to avoid an exception on a malformed last line.)
If you don't already know it, you can learn Python in under an hour with the tutorial at python.org; it is really worth it, even if the only thing you do at a computer is import data into databases.
Two quick thoughts (one might be applicable :)):
Change the value of uid1 to 328383 in every line of the file.
Temporarily make the uid1 column non-mandatory, load the contents of the file, then run a query that sets the value to 328383 in every row. Finally, reset the column to mandatory.