I have a CSV file and I need to transfer all its data to MySQL at the click of a button using Classic ASP. I have searched a lot but I couldn't find an answer.
I would upload the file first using something like Pure ASP File Upload (there are alternatives), and then use this solution to read the file line by line and process its contents: How to read CSV files line by line in VBScript
What pee2pee said is the best method, as you will need a Classic ASP script that can handle the file upload. MySQL also has a default directory it uses to exchange files, and you will have to use it unless you modify --secure_file_priv and clear it (i.e. secure-file-priv = "" in the server configuration). If you clear it you can put the file anywhere and load it from there; otherwise, check the variable with SHOW VARIABLES LIKE "secure_file_priv"; and either confirm it is empty or use only the directory stored in it.
Now this first example is the opposite of what you need, but I will give it to you anyway: it shows how to export data from your database back out into a CSV file. Keep in mind what I said about the directory used for exports and imports, as it won't work unless you either use the directory the server allows or unlock the setting to allow any path. That latter option is less secure, so you might want to set it back before deploying to a production environment unless you really trust your users.
SELECT * FROM table
INTO OUTFILE '/bin/export/yourdata.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';
And the inverse of that, for importing, would be:
LOAD DATA INFILE 'yourdata.csv'
INTO TABLE t1
FIELDS TERMINATED BY ',' ENCLOSED BY '"' ESCAPED BY '"'
LINES TERMINATED BY '\n'
(column1, column2, column3, ...);
Once you have the directory set up, and file privileges granted if needed, get an ASP upload script such as the one mentioned above and use something like the code I provided; once you can upload the file, the rest can be triggered by the magic of a Classic ASP HTML button click.
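For reference, these are the checks and the grant mentioned above, as plain SQL (the account name is just a placeholder; run the GRANT as an administrative user):
SHOW VARIABLES LIKE 'secure_file_priv';       -- empty = any path allowed, NULL = import/export disabled, otherwise only that directory
SHOW GRANTS FOR CURRENT_USER();               -- LOAD DATA INFILE and INTO OUTFILE require the FILE privilege
GRANT FILE ON *.* TO 'youruser'@'localhost';  -- only needed if FILE is missing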
My MySQL query gives me data from 2020-09-21 to 2022-11-02. I want to save the file as FieldData_20200921_20221102.csv.
MySQL query:
SELECT 'datetime','sensor_1','sensor_2'
UNION ALL
SELECT datetime,sensor_1,sensor_2
FROM `field_schema`.`sensor_table`
INTO OUTFILE "FieldData.csv"
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
;
Present output file:
Presently I have named the file FieldData.csv and that is exactly what it gives me. But I want the query to automatically append the first and last dates to the filename, so I can tell the date range of the data without having to open the file.
Expected output file
FieldData_20200921_20221102.csv.
MySQL's SELECT ... INTO OUTFILE syntax accepts only a fixed string literal for the filename, not a variable or an expression.
To make a custom filename, you would have to format the filename yourself and then write dynamic SQL so the filename could be a string literal. But to do that, you first would have to know the minimum and maximum date values in the data set you are dumping.
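As a rough sketch of that dynamic-SQL route: compute the dates into a user variable, build the statement with CONCAT, and run it as a prepared statement. This reuses the table and column names from the question and assumes the default SQL mode (double quotes delimiting strings); check that your MySQL version accepts INTO OUTFILE inside a prepared statement before relying on it.
SELECT CONCAT('FieldData_',
              DATE_FORMAT(MIN(`datetime`), '%Y%m%d'), '_',
              DATE_FORMAT(MAX(`datetime`), '%Y%m%d'), '.csv')
INTO @fname
FROM `field_schema`.`sensor_table`;

SET @sql = CONCAT(
  "SELECT 'datetime','sensor_1','sensor_2' UNION ALL ",
  "SELECT `datetime`, sensor_1, sensor_2 FROM `field_schema`.`sensor_table` ",
  "INTO OUTFILE '", @fname, "' ",
  "FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\\n'");

PREPARE stmt FROM @sql;   -- the filename is now part of a string literal
EXECUTE stmt;
DEALLOCATE PREPARE stmt;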
I hardly ever use SELECT ... INTO OUTFILE, because it can only create the outfile on the database server. I usually want the file to be saved on the server where my client application is running, and the database server's filesystem is not accessible to the application.
Both the file naming problem and the filesystem access problem are better solved by avoiding the SELECT ... INTO OUTFILE feature, and instead writing to a CSV file using application code. Then you can name the file whatever you want.
Hey, I have a large database where customers request data that is specific to them. They usually send me the requests in a text or CSV file. I was wondering if there is a way to get SQL to read that file and put its contents into a SQL query, so I don't have to open the file and copy and paste everything into a query by hand.
Steve already answered it; let me just add a few words. You cannot use a CSV, text, Excel, or any other file format directly in a query for DML/DDL. You can only use a file directly for export/import.
No. MySQL is not designed to do this.
You need an intermediate script that can interpret the files and generate the queries you require.
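For example, if the customer sends a file with one ID per line, the intermediate script would read it and emit something along these lines (the table and column names are made up for illustration):
SELECT *
FROM orders
WHERE customer_id IN (1001, 1002, 1003);  -- the ID list is what the script builds from the uploaded file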
Yes, there is a way to do it: you can import the CSV file into your database and then join it with any query you want.
You can load the CSV file with an SQL statement such as:
LOAD DATA INFILE "/tmp/test.csv"
INTO TABLE test
COLUMNS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
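Once it is loaded you can join it against your own tables; for example (customer_orders and the column names here are placeholders for your actual schema):
SELECT o.*
FROM customer_orders AS o
JOIN test AS t ON t.customer_id = o.customer_id;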
You can use other ways to import the data, see: How to import CSV file to MySQL table.
I tried this SQL solution on Ubuntu 14.04 with MySQL 5.6. For it to work you will have to put the test.csv file in the /tmp directory and run chmod 755 test.csv. Otherwise MySQL gives "Permission denied" errors. More about this issue: LOAD DATA INFILE Error Code : 13
Good Day
I have created a bat file to import a text file to my MySQL database and it looks as follows:
sqlcmd /user root /pass password /db "MyDB" /command "LOAD DATA LOCAL INFILE 'file.csv' INTO TABLE TG_Orders FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'"
My problem is that I cannot get the "Treat consecutive delimiters as one" to work...
How would I add that?
Now that we have actually got to the real crux of the problem, this is not a consecutive delimiter problem - it's a CSV file format problem.
If your CSV file contains fields like B121,535 that are not enclosed within quote marks of some kind, and your delimiter is a comma, then no amount of SQL jiggery-pokery will sort out your problem. Unquoted fields containing commas like this will always be interpreted as two separate fields unless they are enclosed within quote marks.
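To illustrate with made-up values: a line such as
ABC123,B121,535,2014-01-01
is read as four fields, while
ABC123,"B121,535",2014-01-01
is read as three, because the quotes keep the embedded comma inside a single field.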
Post a sample line from the CSV file which is causing problems and we can diagnose further. Failing that, export the data from the original system again, making sure the formatting is correct (either enclose everything in quote marks, or at least the string fields).
Finally, are you sure that your database is MySQL and not Microsoft SQL Server? The only references to SQLCMD.EXE I can find all point to Microsoft sites about SQL Server Express, and even then it has a different option structure (-U for user rather than /user). If that is the case, you could have saved a lot of hassle by tagging the question correctly. If not, then SQLCMD.EXE is presumably a custom-written application from somewhere, and the problem could stem from that; in that case, if the CSV formatting turns out to be correct, we can't help and you're on your own.
So I am trying to access my server's database remotely and have it run commands to export several tables to individual CSV files. What I have is a command line that looks like this:
mysql -h 198.xxx.xxx.xxx -u user -p < file.txt
The contents of file.txt looks like this:
SELECT * FROM log
INTO OUTFILE 'C:\USERS\username\Desktop\log.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
SELECT * FROM permission_types
INTO OUTFILE 'C:\USERS\username\Desktop\permission_types.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
SELECT * FROM personal_info_options
INTO OUTFILE 'C:\USERS\username\Desktop\personal_info_options.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
I am not sure that I have the syntax right, or if this is even possible. I have been doing a bunch of research trying to find examples, but people usually explain the concept and never seem to give you the code you need to test it; it's always something like:
mysql -h localhost -u user -p < somefile
and they don't show you the contents of the file, for example.
I am running Windows 7; I installed WampServer and it has MySQL version 5.5.24, which I access via the command line. I am not sure about the FIELDS TERMINATED BY, ENCLOSED BY, or LINES TERMINATED BY clauses... do I need those at all? Will that actually save to my local machine? I am nervous about running this script; I don't want to make a mistake and mess up the database. Also, is .txt OK for the script file?
Any help you can give would be great.
I am not sure that I have the syntax right, or if this is even possible. I have been doing a bunch of research trying to find examples.
Your syntax is correct, except that each SELECT statement should be terminated with a semicolon. Note that you will also need to specify the database in which your tables reside—it's easiest to do this as an argument to mysql:
mysql -h 198.xxx.xxx.xxx -u user -p mydb < file.txt
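With those changes, file.txt would look something like this. Note the forward slashes as well: MySQL strips unrecognized backslash escapes inside string literals, so forward slashes (or doubled backslashes) are the safer way to write Windows paths. Also remember these paths refer to the server host, as explained below.
SELECT * FROM log
INTO OUTFILE 'C:/USERS/username/Desktop/log.csv'
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n';

SELECT * FROM permission_types
INTO OUTFILE 'C:/USERS/username/Desktop/permission_types.csv'
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n';

SELECT * FROM personal_info_options
INTO OUTFILE 'C:/USERS/username/Desktop/personal_info_options.csv'
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n';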
I am not sure about the FIELDS TERMINATED BY, ENCLOSED BY, or LINES TERMINATED BY clauses... do I need those at all?
As documented under SELECT ... INTO Syntax:
Here is an example that produces a file in the comma-separated values (CSV) format used by many programs:
SELECT a,b,a+b INTO OUTFILE '/tmp/result.txt'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM test_table;
As explained under LOAD DATA Syntax:
If you specify no FIELDS or LINES clause, the defaults are the same as if you had written this:
FIELDS TERMINATED BY '\t' ENCLOSED BY '' ESCAPED BY '\\'
LINES TERMINATED BY '\n' STARTING BY ''
It goes on to explain:
Conversely, the defaults cause SELECT ... INTO OUTFILE to act as follows when writing output:
Write tabs between fields.
Do not enclose fields within any quoting characters.
Use “\” to escape instances of tab, newline, or “\” that occur within field values.
Write newlines at the ends of lines.
Note that because LINES TERMINATED BY '\n' is the default, you could omit that clause; but the FIELDS clauses are necessary for CSV output.
Will that actually save to my local machine?
No. As documented under SELECT ... INTO Syntax:
The SELECT ... INTO OUTFILE 'file_name' form of SELECT writes the selected rows to a file. The file is created on the server host, so you must have the FILE privilege to use this syntax. file_name cannot be an existing file, which among other things prevents files such as /etc/passwd and database tables from being destroyed. The character_set_filesystem system variable controls the interpretation of the file name.
The SELECT ... INTO OUTFILE statement is intended primarily to let you very quickly dump a table to a text file on the server machine. If you want to create the resulting file on some other host than the server host, you normally cannot use SELECT ... INTO OUTFILE since there is no way to write a path to the file relative to the server host's file system.
However, if the MySQL client software is installed on the remote machine, you can instead use a client command such as mysql -e "SELECT ..." > file_name to generate the file on the client host.
It is also possible to create the resulting file on a different host other than the server host, if the location of the file on the remote host can be accessed using a network-mapped path on the server's file system. In this case, the presence of mysql (or some other MySQL client program) is not required on the target host.
I am nervous about running this script; I don't want to make a mistake and mess up the database.
SELECT statements only read data from the database and do not make any changes to its content: thus they cannot "mess up the database".
Also, is .txt OK for the script file?
Any extension will work: neither the MySQL client nor the server software ever sees it (your operating system reads the file and feeds its content to the client program through the redirect, and the client in turn sends that content to the server; the file extension is irrelevant to all of this).
On Windows, a .txt extension will associate the file with a text editor (e.g. Notepad) so that it can be readily opened for editing. Personally I would prefer .sql as it more accurately describes the file's content, and I would then associate that extension with a suitable editor—but none of that is necessary.
I have an Excel file that I need to get into CSV. I export it fine, but when I go to import it into a MySQL database via phpMyAdmin I get an "Invalid field count in CSV input on line 1." error.
The problem seems to be that the fields are not enclosed in double quotes. I just migrated to MS Excel 2007 and am not sure how to adjust the CSV save options so that there are double quotes around the fields and my DB doesn't throw a conniption when I try to import.
Any suggestions? I'm fairly new at going from Excel to CSV but have gotten it to work previously.
Thanks
This worked for me after exporting from Excel as CSV and defining the various options:
load data infile '/tmp/tc_t.csv'
into table new_test_categories
fields terminated by ','
enclosed by '"'
lines terminated by '\n'
ignore 1 lines
(id,category_name,type_id,home_collection,seo_tags,status_id);
I ran this at the mysql prompt.
There should be a CSV (MS-DOS) format in your export drop-down. Pick that one.
There should be an option in the Save As advanced properties or somewhere similar, but if not, you could always change the delimiter character to : or ; or | and then write a quick Perl script to convert it to a quoted, comma-separated file.
Or you could just try a tab-separated values file instead; I think phpMyAdmin will read TSVs as well.