Export SQLite DB to CSV issue

When I use db.execSQL(".mode csv") in Java code, it generates an error in logcat:
/AndroidRuntime( 1363): FATAL EXCEPTION: main
/AndroidRuntime( 1363): android.database.sqlite.SQLiteException: near ".": syntax error: .mode csv
But if I issue the same command in the sqlite console, it works. I also cannot set the separator in Java code:
sqlite> .mode csv
.mode csv
sqlite> .separator ,
.separator ,
sqlite>
Can anyone share their experience or the correct approach? I would appreciate it if code is provided.
Thanks!

The .mode csv syntax and the other dot-commands belong to the sqlite3 shell, which is a separate program built on top of SQLite.
What you can do in Java is just use the database engine itself, not the .mode, .help, .quit or .separator commands from that other program.
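In practice that means querying the rows yourself and writing the CSV by hand. A minimal sketch, assuming a table named emp and an already-open SQLiteDatabase (try-with-resources needs API 19+, and no quoting of embedded commas or quotes is done here):

import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;
import android.text.TextUtils;
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;

public class CsvExporter {
    // Writes every row of the (hypothetical) emp table into outFile as CSV.
    public static void exportTableToCsv(SQLiteDatabase db, File outFile) throws IOException {
        try (FileWriter writer = new FileWriter(outFile);
             Cursor cursor = db.rawQuery("SELECT * FROM emp", null)) {
            writer.write(TextUtils.join(",", cursor.getColumnNames())); // header row
            writer.write("\n");
            while (cursor.moveToNext()) {
                StringBuilder row = new StringBuilder();
                for (int i = 0; i < cursor.getColumnCount(); i++) {
                    if (i > 0) row.append(',');      // this is where you control the separator
                    row.append(cursor.getString(i)); // NULL columns come out as "null"
                }
                writer.write(row.append('\n').toString());
            }
        }
    }
}

Call it from a background thread and pass something like new File(getExternalFilesDir(null), "emp.csv") as the target file.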
If you are curious, you can also get the source code for the SQLite shell itself:
fossil clone http://www.sqlite.org/src _my_sqlite_repository
mkdir SQLite_source
cd SQLite_source
fossil open ../_my_sqlite_repository
Then you can pull the latest updates with fossil update trunk and look at the source code in src/shell.c. You will probably notice that it is the only piece of source code in there that includes additional libraries.

Related

What is the correct syntax for the SOURCE command in SQL

In Codeanywhere I'm trying to run pre-written script files to create a table. When using Codeanywhere one must first import the file with the code into the shell, as I have done. However, I have been unable to use the SOURCE command to run these files. I have attempted this syntax:
USE exams SOURCE students.txt;
What is the correct syntax here? Do I need to name the database in the syntax?
Are there other commands which run text files containing code?
EDIT: I tried using this syntax, to the following result:
ERROR: Failed to open file 'exams(question5.txt)', error: 2
Put the commands on separate lines, without semicolons for the shell commands, and if this doesn't work, prefix them with \ as well (I don't need to on my setup, but it's in the docs):
USE exams
SOURCE students.txt
https://dev.mysql.com/doc/mysql-shell-excerpt/5.7/en/mysql-shell-commands.html
On the shell you can use the following command to execute the queries from a text file:
mysql db_name < text_file
Hint: if a USE command (with the correct database name) is specified in the text file, you don't need to specify the database on the command line. The SOURCE command is not available in MySQL; you need the < instead.
You can find more information about executing queries from text files here:
https://dev.mysql.com/doc/refman/5.7/en/mysql-batch-commands.html

Error when exporting/importing an SQL file from phpMyAdmin

There's something I've done a hundred times: exporting a MySQL database from one server and importing it into another. The export function provides an .sql file which then gets imported into the new server. However, my servers recently updated their phpMyAdmin version (currently 4.6.0) and now whenever I try to do that I get an error when importing. I think it has something to do with the escaping, as one of the lines now looks like this in the exported file:
(5, 'that\\\'s not even', '2014-05-25 22:35:51', 0)
That is part of an INSERT statement for one of the tables, and the triple \\\ is what bothers me. I've tried to look around the configuration and find something related to the escaping, but alas, no luck. Not sure if that's really the issue, but any tip on what might be wrong and how to fix it is more than welcome.
EDIT:
In fact, that line seems to have nothing to do with the error. The error that gets displayed on import is the following:
Static analysis:
1 errors were found during analysis.
Ending quote ' was expected. (near "" at position 2615077)
After that a very long query follows. I don't know if this is relevant either, but it ends with the following line, which is far from being the last one in the file:
(33, 'active_plugins', 'a:2:{i:0;s:37:"admin-in-english/admin-in-english.php";i:1;s:29:"filedownload/filedownload.php";}', 'yes'),
That last one in particular is from a bunch of WordPress tables in the database if that matters.
EDIT2:
And here's something even more interesting. I keep backups of old database dumps, so I tried to import a dump from a couple of months back that definitely imported successfully back then. Right now, same file, but I get an error as soon as I try the import...
After a lot of headbanging it turns out that the problem was a limitation imposed by PHP on files larger than 6 MB. After 6 MB of the query it would just cut it off right there and, logically, throw an error afterwards.
The solution is either to raise those limits or, in my case, as I don't have direct access to the configuration files, to import over SSH, which worked successfully.

Importing a CSV file into Cassandra

I am using the COPY command to load data from a CSV file into a Cassandra table. The following error occurs when using the command.
Command: COPY quote.emp(alt, high,low) FROM 'test1.csv' WITH HEADER= true ;
The error is:
get_num_processess() takes no keyword argument.
This is caused by CASSANDRA-11574. As mentioned in the ticket comments, there is a workaround:
Move copyutil.c somewhere else. Same thing if you also have a copyutil.so.
You should be able to find these files under pylib/cqlshlib/.

Appending JSON in the BigQuery CLI using write_disposition=WRITE_APPEND fails

I could not get the bq shell to append a JSON file using the flag --write_disposition=WRITE_APPEND.
load --source_format=NEWLINE_DELIMITED_JSON --write_disposition=WRITE_APPEND dataset.tablename /home/file1/one.log /home/file1/jschema.json
I have a file named one.log and its schema jschema.json.
While executing the script, it says:
FATAL flags parsing error : unknown command line flag 'write_dispostion'
RUN 'bq.py help' to get help.
I believe BigQuery is append-only, so there should be a way to append data to a table, but I cannot find a workaround. Any assistance, please?
I believe the default operation mode of the bq tool is WRITE_APPEND,
and there is no --write_disposition switch for the bq shell utility.
But there is a --replace flag that sets the write disposition to truncate.
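If you do need explicit control over the disposition, the google-cloud-bigquery Java client exposes it directly. A hedged sketch of loading a local newline-delimited JSON file with WRITE_APPEND (dataset, table and file names taken from the question, everything else is assumed; check the class names against your client version):

import com.google.cloud.bigquery.*;
import java.io.OutputStream;
import java.nio.channels.Channels;
import java.nio.file.Files;
import java.nio.file.Paths;

public class AppendJsonToTable {
    public static void main(String[] args) throws Exception {
        BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
        WriteChannelConfiguration config =
            WriteChannelConfiguration.newBuilder(TableId.of("dataset", "tablename"))
                .setFormatOptions(FormatOptions.json())                     // newline-delimited JSON
                .setWriteDisposition(JobInfo.WriteDisposition.WRITE_APPEND) // append, don't truncate
                .build();
        // Stream the local file into a load job.
        TableDataWriteChannel writer = bigquery.writer(config);
        try (OutputStream stream = Channels.newOutputStream(writer)) {
            Files.copy(Paths.get("/home/file1/one.log"), stream);
        }
        Job job = writer.getJob().waitFor(); // wait for the load to finish
        System.out.println("Succeeded: " + (job.getStatus().getError() == null));
    }
}

Since the table already exists, the load job can reuse its schema, so the separate jschema.json file is not strictly needed for an append.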

SSIS - Export multiple SQL Server tables to multiple text files

I have to move data between two SQL Server DBs. My task is to export the data as text (.dat) files, move the files and import into the destination. I have to migrate over 200 tables.
This is what I tried:
1) I used an Execute SQL task to fetch my tables.
2) Used a For Each Loop to loop through the table names from the collection.
3) Used a script task inside the For Each Loop to build the text file destination path.
4) Called a DFT with the table name in a variable for the OLE DB source and the path name in a variable for the flat file destination.
The first table extracts fine, but the second table bombs with a synchronization error. I see this in numerous posts but could not find one that matches my scenario, hence posting here.
Even if I get the package to work with multiple DFTs, the second table from the second DFT does not export its columns because the flat file connection manager still remembers the first table's columns. Is there a way to get it to forget them?
Any thoughts on how I can export multiple tables to multiple text files using one DFT with a dynamic source and destination variable?
Thanks and appreciate your help.
Unfortunately, only the Bulk Insert Task lets us use format files effectively to map the columns between source and destination. The Bulk Insert Task uses the BULK INSERT T-SQL command to import the data, and to execute it the user needs the BULKADMIN server privilege.
Most companies will not grant the BULKADMIN server privilege for security reasons.
Hence, using a script task to construct BCP statements is a good and simple option for the export.
You do not need to construct a .bat file, as the script itself can execute DOS commands, which run under the .NET security account.
I figured out a way to do this. I thought I would share it in case anybody is stuck in the same situation.
So, in summary, I needed to export and import data via files. I also wanted to use a format file if at all possible, for various reasons.
What I did was:
1) Construct a DFT which gets me the list of table names I need to export from the DB. I used an OLE DB source and a Recordset Destination as the target, and stored the table names inside an object variable.
A DFT is not really necessary; you can do it any other way. Also, in our application, we store the table names in a table.
2) Add a 'For each loop container' with a 'For Each ADO Enumerator' which takes my object variable from the previous step into the collection.
3) Parse the variable one by one and construct BCP statements like the ones below inside a Script task. Create variables as necessary. The BCP statement will be stored in a variable.
I loop through the tables and construct multiple BCP statements like this:
BCP "DBNAME.DBO.TABLENAME1" out "PATH\FILENAME2.dat" -S SERVERNAME -T -t"|" -r$\n -f "PATH\filename.fmt"
BCP "DBNAME.DBO.TABLENAME1" out "PATH\FILENAME2.dat" -S SERVERNAME -T -t"|" -r$\n -f "PATH\filename.fmt"
The statements are put inside a .bat file; this is also done inside the script task (a rough sketch of this step follows after the list).
4) An Execute Process Task will next execute the .bat file. I had to do this because I do not have the option to use the 'master..xp_cmdshell' command or the 'BULK INSERT' command in my company. If I had the option to execute cmdshell, I could have run the command directly from the package.
5) Again add a 'For each loop container' with a 'For Each ADO Enumerator' which takes my object variable from step 1 into the collection.
6) Parse the variable one by one and construct BCP statements like this inside a Script task. Create variables as necessary. The BCP statement will be stored in a variable.
I loop through the tables and construct multiple BCP statements like this.
BCP "DBNAME.DBO.TABLENAME1" in "PATH\FILENAME2.dat" -S SERVERNAME -T -t"|" -r$\n -b10000 -f "PATH\filename.fmt"
BCP "DBNAME.DBO.TABLENAME1" in "PATH\FILENAME2.dat" -S SERVERNAME -T -t"|" -r$\n -b10000 -f "PATH\filename.fmt"
The statements are put inside a .bat file. This is also done inside the script task.
The -b10000 was added so I can import in batches. Without it, many of my large tables could not be copied due to lack of space in tempdb.
7) Run the .bat file, this time to import the files.
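For reference, a rough sketch of the string building in steps 3 and 6. In SSIS the script task is written in C# or VB.NET; Java is used here only for illustration, and the table list, paths and server name are placeholders:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class BuildBcpBat {
    public static void main(String[] args) throws IOException {
        // In the package these values come from the For Each loop and package variables.
        List<String> tables = Arrays.asList("TABLENAME1", "TABLENAME2");
        String server = "SERVERNAME";
        String exportDir = "C:\\exports";                // hypothetical output folder
        String formatFile = "C:\\exports\\filename.fmt"; // hypothetical format file

        List<String> lines = new ArrayList<>();
        for (String table : tables) {
            // One BCP "out" statement per table, pipe-delimited, using the format file.
            lines.add("BCP \"DBNAME.DBO." + table + "\" out \"" + exportDir + "\\" + table + ".dat\""
                    + " -S " + server + " -T -t\"|\" -r$\\n -f \"" + formatFile + "\"");
        }
        // The Execute Process Task then runs this .bat file; for the import side,
        // swap "out" for "in" and add -b10000.
        Files.write(Paths.get(exportDir, "export.bat"), lines);
    }
}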
I am not sure if this is the best solution, but I thought I would share what satisfied my requirement. If my answer is not clear, I would be happy to explain if you have any questions. We can also optimize this solution. The same can be done purely via VB scripts, but you have to write some code to do that.
I also created a package configuration file where I can change the DB name, server name, and the data and format file locations dynamically.
Thanks.