Automatically Export Data From Database to CSV File - MySQL

I have to generate a CSV file of the full database/table whenever a new row is inserted into the table.
Is there any script I can use to generate the CSV file?
I use MySQL to store data in the database from an HTML form.
Please help.

Finally, I found a beautiful tutorial on exporting data from a database to a CSV file, and also an answer on Stack Overflow:
sqlcmd -S . -d DatabaseName -E -s, -W -Q "SELECT * FROM TableName" > C:\Test.csv
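Note that sqlcmd is SQL Server's command-line tool, so the command above will not run against MySQL. A rough MySQL equivalent using the mysql client could look like the following sketch (username, DatabaseName, TableName, and Test.csv are placeholders; the sed step converts the client's tab-separated batch output into commas):
mysql -u username -p -e "SELECT * FROM TableName" DatabaseName | sed 's/\t/,/g' > Test.csv
To regenerate the file whenever a new row arrives, the same command can be run from a cron job or from the server-side script that handles the HTML form submission.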

Alternatively, you can use Skyvia, a cloud solution with native support for CSV export from a MySQL database. Just type in the query, or use the Query Designer for a no-code option, then export the results to CSV. After the results appear, simply click the CSV button and a CSV download will appear in your browser.

Related

How to import JSON file in PostgreSQL: COPY 1

I'm new to PostgreSQL. I'm trying to import a JSON file into a PostgreSQL table. I created an empty table:
covid19=# CREATE TABLE temp_cov(
covid19(# data jsonb
covid19(# );
and tried to copy my data from the JSON file into this table with this command at the command line:
cat output.json | psql -h localhost -p 5432 covid19 -U postgres -c "COPY temp_cov (data) FROM STDIN;"
The output was just "COPY 1". But when I open my table in psql with
SELECT * FROM temp_cov;
the command runs without end.
Unfortunately, I couldn't find an answer or a solution to a similar problem. Thank you in advance for your advice.
Also, my JSON file is already modified to the "not pretty" form, and it has more than 11k lines.
Your data is there. psql is sending the row to the pager (likely more?), and the pager can't deal with it very usably because it is too big. You can turn off the pager (\pset pager off inside psql) or set the pager to a better program (PAGER=less or PSQL_PAGER=less as environment variables), but really none of those is going to be all that useful for viewing giant JSON data.
You have your data in PostgreSQL, now what do you want to do with it? Just looking at it within psql's pager is unlikely to be interesting.
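For example, mirroring the psql invocation from the question, you could disable the pager and take a readable sample instead of dumping every row (jsonb_pretty and LIMIT 1 are just one way to peek at the data):
psql -h localhost -p 5432 covid19 -U postgres -P pager=off -c "SELECT jsonb_pretty(data) FROM temp_cov LIMIT 1;"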

How to convert dBase III files to MySQL?

Is it possible to convert .DBF files to any other format?
Does anybody know of a script that can be used to convert .DBF files to a MySQL query?
It would also be fine to convert the DBF files to CSV files.
I always have problems with the codec of the DBF files.
Konstantin
https://www.dbase.com/Knowledgebase/faq/import_export_data.asp
Q: How do I export data from a dBASE table to a text file?
A: Exporting data from dBASE to a text file is handled through the COPY TO command.
Like the APPEND FROM command, there are a number of ways to use this command. Here we are only interested in its most basic use. Once you understand how to use this command, you can go to your on-line help for further details on what can be accomplished with the COPY TO command.
In order to export data you must first be using the table from which the data will be exported. As before, you will be employing the USE command in the command window.
USE <tablename>
For example:
USE Mytest.dbf
Once the table is in use, all you need to do is type the following command in the command window:
COPY TO <filename> TYPE DELIMITED
For example:
COPY TO Myexport.txt TYPE DELIMITED
This would result in a file being created in the current directory called Myexport.txt which would be in the DELIMITED or *.CSV format.
If we had wanted to export the data in the *.SDF format, we would have typed:
COPY TO Myexport.txt TYPE SDF
This would result in a file being created in the current directory called Myexport.txt which would be in the System Delimited or *.SDF format.
Those are the basics on how to import and export text data into a dBASE table. For further information consult the on-line help for the APPEND FROM and COPY TO commands.
I converted old (circa 1997) DBF files to CSV using Python and the dbfread module.
After installing Python, install the dbfread module from a command prompt (pip is a shell command, not something typed into the Python interpreter):
pip install dbfread
The module has many methods for reading DBF files and excellent documentation.
Then a Python script does the job, or you can type the commands directly into the interpreter:
import csv
from dbfread import DBF

# Read the DBF file
table = DBF('C:/my_dbf_file.dbf', encoding='1252')

outFileName = 'C:/my_export.csv'
with open(outFileName, 'w', newline='', encoding='1252') as file:
    writer = csv.writer(file)
    # First row: the column names
    writer.writerow(table.field_names)
    # Then each record, written one at a time
    for record in table:
        writer.writerow(list(record.values()))
Note that each record in the database is read and saved one at a time, and that the first line of the CSV file holds the column names.
Encoding can be problematic; if one code page fails, try another from Python's list of standard encodings. The dbfread.DBF() method tries to guess the encoding but is not perfect, which is why the code specifies the encoding parameter in both DBF() and open().

How to store a SQL query response with dotted lines into a text file?

I have a simple question: I want to be able to store a SQL query response with the dotted lines (the table borders). So when I hit MySQL from the command-line interface, like mysql -h${sqlhost} -u${sqluser} -p${sqlpass} -e "SELECT * FROM test.employee" > output.txt, I should be able to store the structured output in the text file - in a Linux environment, though even Windows would do.
I should be able to store the structured view into, say, an output.txt file.
Use the --table switch, as per the documentation: https://dev.mysql.com/doc/refman/5.7/en/mysql-shell-output-table-format.html
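For example, reusing the shell variables from the question (the mysql client normally switches to tab-separated output when redirected to a file; -t/--table forces the bordered table format):
mysql -h${sqlhost} -u${sqluser} -p${sqlpass} -t -e "SELECT * FROM test.employee" > output.txt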

Export a MySQL column to a plain txt file with no headings

So what I'm trying to do is write a script or cron job (Linux - CentOS) to export the usernames listed in my WordPress database to a simple .txt file with just one username per line. I would like the .txt file to read like this:
Sir_Fluffulus
NunjaX007
(Except with all the usernames in the user_login column.)
I have found how to export the entire table to a CSV file, but that contains 10+ fields (columns) that I DO NOT want to show up in this text file.
Can anyone point me in the right direction on how to do this?
If it helps, this is going to be for exporting users that have signed up on our website (Wordpress) to a whitelist.txt file for Minecraft. Thanks!
Pass a query into the mysql tool, and use silent mode.
$ mysql -u username dbname -s <<< 'SELECT fieldname FROM tablename'
Sir_Fluffulus
NunjaX007
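Since the goal is a cron job, a crontab entry along these lines would regenerate the whitelist every five minutes (wp_users and user_login are the WordPress defaults; the schedule and output path are assumptions, and -N suppresses the column-name header):
*/5 * * * * mysql -u username dbname -s -N -e 'SELECT user_login FROM wp_users' > /path/to/whitelist.txt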

How to export sqlite into CSV using RSqlite?

How to export sqlite into CSV using RSqlite?
I am asking because I am not familiar with database files, so I want to convert it using R.
It may be very simple, but I haven't figured it out.
Not quite sure if you have figured this out. I am not quite sure how to do it within R either, but it seems pretty simple to export to CSV using SQLite itself, or by writing out a CSV from the database you have loaded into R.
In SQLite, you can do something like this at the sqlite3 command prompt:
sqlite> .mode csv
sqlite> .headers on
sqlite> .output output.csv
sqlite> SELECT * FROM table_name;
sqlite> .exit
SQLite will write your table out to the output.csv file.
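Alternatively, the same export can be done non-interactively from the shell in one line (the database and output file names here are placeholders):
sqlite3 -header -csv my_database.db "SELECT * FROM table_name;" > output.csv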
If the table is not too large, you can first read it into a data frame or matrix in R using dbGetQuery, or dbSendQuery and fetch. Then you can write that data frame out as a .csv file.
library(RSQLite)
My_conn <- dbConnect(SQLite(), "my_database.db")  # hypothetical database file name
my.data.frame <- dbGetQuery(My_conn, "SELECT * FROM My_Table")
write.csv(my.data.frame, file = "MyFileName.csv")