Export SQLite3 to CSV with text representation of GUID/BLOB

I would like to export each table of my SQLite3 database to CSV files for further manipulation with Python, and after that I want to import the CSV files into a different database (PostgreSQL). The ID column in SQLite3 is of type GUID, hence gibberish when I export tables to CSV as text:
l_yQ��rG�M�2�"�o
I know that there is a way to turn it into a readable format, since the SQLite Manager add-on for Firefox does this automatically, sadly without any indication of how or which query is used:
X'35B17880847326409E61DB91CC7B552E'
I know that QUOTE(GUID) displays the desired hexadecimal string, but I don't know how to dump it to the CSV instead of the BLOB.

I found out what my error was - not why it doesn't work, but how to get around it.
So I tried to export my tables as stated in https://www.sqlite.org/cli.html , namely the multiline dot-command variant, which didn't work:
sqlite3 'path_to_db'
.headers on
.mode csv
.output outfile.csv
SELECT statement
and so on.
I was testing a few things, and since I'm lazy while testing, I used the single-line variant, which got the job done:
sqlite3 -header -csv 'path_to_db' "SELECT QUOTE (ID) AS Hex_ID, * FROM Table" > 'output_file.csv'
Of course it would be better to specify all column names instead of using *, but this suffices as an example.
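If you prefer to handle the conversion in Python (which you mention using for further manipulation anyway), here is a minimal sketch with the built-in sqlite3 and csv modules; the database path, table name, and output file name are placeholders:
import csv
import sqlite3

# Placeholder path and table name.
conn = sqlite3.connect('path_to_db')
cur = conn.cursor()
cur.execute("SELECT * FROM MyTable")
with open('output_file.csv', 'w', newline='') as f:
    writer = csv.writer(f)
    writer.writerow([d[0] for d in cur.description])  # header row
    for row in cur:
        # Convert any BLOB (bytes) value to an uppercase hex string,
        # similar to what SQLite's QUOTE() displays.
        writer.writerow([v.hex().upper() if isinstance(v, bytes) else v for v in row])
conn.close()
This keeps the original column layout and converts only the BLOB values, without needing the extra QUOTE(ID) column.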

Related

How to form correct CSV line with postgres SQL

Is it possible to query PostgreSQL in order to get a correct CSV line? For instance select concat(a, ',', b) from t, but with correctly escaped commas and quotes.
A couple of options.
Using psql
select * from some_table \g (format=csv) output.csv
This will create a CSV file named output.csv.
Or use \copy:
\copy cell_per to 'output.csv' WITH (format csv, header, delimiter '|');
\copy lets you use the options explained in the COPY documentation to do things like change the delimiter, quoting, etc.
You can also use COPY directly as a query. Though in that case it is important to note that COPY runs as the server user and can only write files to directories the server user has permissions on. The workaround is to make the output go to STDOUT and capture it. For instance, the Python driver psycopg2 provides copy methods for this.
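As a minimal sketch of the STDOUT approach with psycopg2 (the connection string and table name are placeholders):
import psycopg2

conn = psycopg2.connect('dbname=mydb user=myuser')  # placeholder connection string
cur = conn.cursor()
# COPY ... TO STDOUT runs on the server but streams the result to the
# client, so no server-side file permissions are needed.
with open('output.csv', 'w') as f:
    cur.copy_expert("COPY some_table TO STDOUT WITH (FORMAT csv, HEADER)", f)
conn.close()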

How to export data from Cassandra table having JSON value to a CSV file?

I have a table in a Cassandra DB, and one of the columns has a value in JSON format. I am using DataStax DevCenter to query the DB, and when I try to export the result to CSV, the JSON value gets broken into separate columns wherever there is a comma (,). I even tried to export from the command prompt without giving any delimiter; that too resulted in a broken JSON value.
Is there any way to achieve this task?
Use the COPY command to export the table as a whole with a different delimiter.
For example :
COPY keyspace.your_table (your_id, your_col) TO 'your_table.csv' WITH DELIMITER='|' ;
Then filter this data programmatically in whatever way you want.
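For that programmatic step, here is a minimal Python sketch that reads the pipe-delimited export and parses the JSON column; the file name and two-column layout follow the COPY example above and are assumptions:
import csv
import json

with open('your_table.csv', newline='') as f:
    for your_id, your_col in csv.reader(f, delimiter='|'):
        record = json.loads(your_col)  # the JSON survives intact with a '|' delimiter
        print(your_id, record)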

How to have column with character value equal to the enclosing character value in mysql load data in file

I'm using mysqlimport, which uses the LOAD DATA INFILE command. My question is the following: assume I have --fields-enclosed-by='"', and that I have a column with values that contain the double-quote character, such as "5" object" (which stands for 5 inches). The problem is that when MySQL encounters the double quote after the 5, it treats it as the enclosing character, and things get messed up. How do I use mysqlimport with such values? I don't want to just use another enclosing character, because that other character may occur in the data as well. So what is a general solution for this?
I guess it will be difficult to import it this way as CSV.
To solve the above issue another way:
Export or convert the old data into SQL format rather than CSV format.
Import that SQL data using the mysql command line tool.
mysql -hservername -uusername -p'password' dbname < 'path to your imported sql file.sql'
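If you do want to stay with CSV, the general convention is to double the enclosing character inside a field, and LOAD DATA INFILE interprets a doubled ENCLOSED BY character as a single literal instance. A minimal Python sketch that writes such a file (file name and data are made up):
import csv

# Writes: "5"" object","42" - the embedded quote is doubled, which
# LOAD DATA INFILE with FIELDS ENCLOSED BY '"' reads back as: 5" object
with open('data.csv', 'w', newline='') as f:
    csv.writer(f, quoting=csv.QUOTE_ALL).writerow(['5" object', 42])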

How to import .txt to MySQL table

How do I import a .txt file into a MySQL table?
My .txt file is like this...
ex : AF0856427R1 000002200R HADISUMARNO GGKAMP MALANG WET 3 6 00705 AFAAADF16000-AD-FA P00.001.0 1 000001.00022947.70023290.00 T511060856425A 022014B
There are 39 fields in my file.
Try the mysqlimport command.
The name of the text file should be the name of the table into which you want the data imported. For example, if your file name is patient.txt, the data will be imported into the patient table.
mysqlimport [options] db_name textfile
There are a lot of options that you can pass in; see the mysqlimport documentation.
Especially since some of your fields are terminated by spaces and some are based on string length, I would definitely first do some string manipulation with your favorite tool (sed, awk, and perl are all likely very good choices).
Create an intermediary comma-separated file. If you have commas in the existing file, you can easily use some other character instead. The goal is to create a file that has one consistent separator; a sketch of this conversion follows.
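As one way to do that conversion in Python (splitting on runs of whitespace is an assumption; your 39-field layout may need fixed-width slicing instead):
import csv

with open('patient.txt') as src, open('patient.csv', 'w', newline='') as dst:
    writer = csv.writer(dst, delimiter='|')
    for line in src:
        # Split on runs of whitespace; for fixed-width fields, slice
        # instead, e.g. line[0:11], line[11:22], ...
        writer.writerow(line.split())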
You've used the phpMyAdmin tag, so from your table go to the Import tab, select the file, and pick CSV from the dropdown of file types. Edit the options according to what your file looks like (for instance, perhaps § is your column separator and you might leave the next two options blank). Then try the import and check the data to make sure it all arrived in the columns you expected.
Good luck.

How to export sqlite into CSV using RSqlite?

How to export sqlite into CSV using RSqlite?
I am asking because I am not familiar with database files, so I want to convert it using R.
It may be very simple, but I haven't figured it out.
Not quite sure if you have figured this out. I am not quite sure how to do it within R either, but it seems pretty simple to export to CSV using SQLite itself, or by writing out a CSV from the database you have loaded into R.
In SQLite, you can do something like this at the sqlite3 command prompt:
.mode csv
.headers on
.output output.csv
select * from table_name;
.exit
SQLite will automatically write your table out to the output.csv file.
If the table is not too large, you can first read it into a data frame or matrix in R using dbGetQuery, or using dbSendQuery and fetch. Then you can write that data frame out as a .csv file:
my.data.frame <- dbGetQuery(My_conn, "SELECT * FROM My_Table")
write.csv(my.data.frame, file = "MyFileName.csv", ...)