How to save Amazon Redshift output to a local CSV through SQL Developer

I am trying to save Amazon Redshift output to a local CSV through SQL Developer. Please suggest a solution other than the export wizard in SQL Developer for generating the CSV, as the data volume is in the tens of millions of rows and the extract is required frequently. The spool command creates a file locally, but I am unable to format it as a CSV file.

The only reason we support Redshift's JDBC driver in SQL Developer is so you can migrate that data warehouse to Oracle Autonomous DW Cloud Service.
Yes, you can run queries and spool CSV to local files, and that might be faster than using the wizard, but what you're describing isn't really what Oracle SQL Developer is built for.
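If the wizard and spool don't scale, another option is to script the extract outside SQL Developer entirely. Below is a minimal Python sketch, assuming the psycopg2 driver (Redshift speaks the PostgreSQL protocol) with placeholder connection details, query, and file name; a server-side cursor streams the rows in batches so tens of millions of them never have to sit in memory at once.

# Minimal sketch: stream a Redshift query result straight to a local CSV.
# Assumes psycopg2 is installed; endpoint, credentials and query are placeholders.
import csv
import psycopg2

conn = psycopg2.connect(
    host="my-cluster.xxxxxx.us-east-1.redshift.amazonaws.com",  # placeholder endpoint
    port=5439,
    dbname="mydb",
    user="myuser",
    password="mypassword",
)

with conn, conn.cursor(name="csv_export") as cur:  # named cursor = server-side, streams rows
    cur.execute("SELECT * FROM my_schema.my_table")  # placeholder query
    with open("extract.csv", "w", newline="") as f:
        writer = csv.writer(f)
        rows = cur.fetchmany(10000)
        if rows:
            writer.writerow([col[0] for col in cur.description])  # header row
        while rows:
            writer.writerows(rows)
            rows = cur.fetchmany(10000)

conn.close()

Because the script does its own formatting with the csv module, quoting and delimiters come out as proper CSV, which is the part the spool command makes awkward.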

Related

How can I transfer data from an AWS RDS PostgreSQL DB instance to a MySQL DB on a different server?

Until now I have been transferring the data manually by exporting it and then importing it into the MySQL DB, but now I have to automate the whole process.
I want to generate CSV files and FTP them to the MySQL server.
pgAdmin lets me download the file through Windows, but when I export via the "COPY" command I get an error saying that I need to be a superuser and that I should use "\copy" instead.
And I cannot access the operating system of the PostgreSQL server.
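Since \copy is just a client-side wrapper around COPY ... TO STDOUT, a script can get the same result without superuser rights and without OS access on the PostgreSQL server. A rough sketch, assuming psycopg2 and placeholder connection details and table name:

# Rough sketch: export a table to a local CSV via client-side COPY,
# the same mechanism psql's \copy uses, so no superuser is needed.
# Assumes psycopg2; endpoint, credentials and table name are placeholders.
import psycopg2

conn = psycopg2.connect(
    host="mydb.xxxxxx.us-east-1.rds.amazonaws.com",  # placeholder RDS endpoint
    dbname="mydb",
    user="myuser",
    password="mypassword",
)

with conn.cursor() as cur, open("export.csv", "w") as f:
    # COPY ... TO STDOUT streams the data to the client, which writes it locally.
    cur.copy_expert("COPY my_table TO STDOUT WITH (FORMAT csv, HEADER)", f)

conn.close()

The resulting file can then be pushed over FTP/SFTP and loaded into MySQL with LOAD DATA, which covers the automated pipeline described above.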

Importing a .bak from MSSQL into a MySQL database

My company's site uses a MySQL database. One of our clients, who is just trying to take advantage of our API, is only able to give us the data in the form of an MSSQL .bak file.
I have been trying to import the file using the migration tool built into MySQL Workbench, but have had no luck.
On top of that, I am trying to see if this can be done in PowerShell, as I would like to automate this process in the future.
Any suggestions or help would be appreciated.
You cannot. MS SQL Server backups are proprietary to MS SQL Server and cannot be used with any other RDBMS. You will need to restore this backup to SQL Server, then use an additional tool to transfer the data from SQL Server into MySQL.
Can you do that second portion through PowerShell? Probably, though SSIS would likely be a better method.
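If SSIS feels heavier than needed for that second step, a small script can do the transfer once the .bak has been restored to a SQL Server instance you can reach. This is only a sketch, assuming pyodbc and mysql-connector-python are installed; server names, credentials, tables and columns are all placeholders.

# Sketch: copy rows from a restored SQL Server table into MySQL in batches.
# Assumes pyodbc + mysql-connector-python; all names and credentials are placeholders.
import pyodbc
import mysql.connector

src = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=mssql-host;DATABASE=ClientDb;UID=user;PWD=password"
)
dst = mysql.connector.connect(
    host="mysql-host", database="ourdb", user="user", password="password"
)

read = src.cursor()
write = dst.cursor()

read.execute("SELECT id, name, created_at FROM dbo.ClientTable")  # placeholder table
while True:
    rows = read.fetchmany(5000)
    if not rows:
        break
    write.executemany(
        "INSERT INTO client_table (id, name, created_at) VALUES (%s, %s, %s)",
        [tuple(r) for r in rows],
    )
    dst.commit()

src.close()
dst.close()

The same loop could be ported to PowerShell with System.Data.SqlClient and MySql.Data if you want to keep the automation in one place.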

Can I import my MySQL data into Google Spanner?

I tried to import a mysqldump-generated .sql file, but Google Spanner didn't accept the syntax, which makes sense.
With that said, we're trying to migrate our data, which is in a MySQL database, into Google Cloud Spanner. Is this possible?
Here's a project that I used to upload my (test) data to Cloud Spanner from a PostgreSQL database: https://github.com/olavloite/spanner-jdbc-converter. It uses standard JDBC functionality, so it should be easy to adapt it to work with MySQL as well.
Another possibility, if the database you're trying to upload is not very big, would be to use a standard database tool that can copy data from one JDBC-compliant database to another. DBeaver supports this feature. Have a look here for how to set up DBeaver with Google Cloud Spanner: http://www.googlecloudspanner.com/2017/10/using-standard-database-tools-with.html
You can use HarbourBridge to migrate both schema and data to Cloud Spanner from a MySQL source.
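If the dataset is small enough to script, the same idea can also be sketched directly in Python with the Cloud Spanner client library instead of a JDBC tool. The snippet below is only an illustration, assuming mysql-connector-python and google-cloud-spanner are installed and the target table already exists in Spanner with a matching schema; instance, database, table and column names are placeholders.

# Illustrative sketch: copy rows from MySQL into an existing Cloud Spanner table.
# Assumes mysql-connector-python and google-cloud-spanner; all names are placeholders.
import mysql.connector
from google.cloud import spanner

mysql_conn = mysql.connector.connect(
    host="mysql-host", database="mydb", user="user", password="password"
)
spanner_db = (
    spanner.Client()
    .instance("my-instance")   # placeholder Spanner instance
    .database("my-database")   # placeholder Spanner database
)

cur = mysql_conn.cursor()
cur.execute("SELECT id, name, price FROM products")  # placeholder source table

while True:
    rows = cur.fetchmany(500)  # keep batches small; Spanner limits mutation size
    if not rows:
        break
    with spanner_db.batch() as batch:
        batch.insert(
            table="products",
            columns=("id", "name", "price"),
            values=rows,
        )

mysql_conn.close()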

Import data from CSV file to Amazon Web Services RDS MySQL database

I have created a relational database (MySQL) hosted on Amazon Web Services. What I would like to do next is import the data in my local CSV files into this database. I would really appreciate it if someone could provide an outline of how to go about it. Thanks!
This is easiest and most hands-off using the MySQL command line. For large loads, consider spinning up a new EC2 instance, installing the MySQL command-line tools, and transferring your file to that machine. Then, after connecting to your database from the command line, you'd do something like:
mysql> LOAD DATA LOCAL INFILE 'C:/upload.csv' INTO TABLE myTable;
There are also options to match your file's details and ignore the header row (plenty more in the docs):
mysql> LOAD DATA LOCAL INFILE 'C:/upload.csv' INTO TABLE myTable FIELDS TERMINATED BY ','
ENCLOSED BY '"' IGNORE 1 LINES;
If you're hesitant to use the command line, download MySQL Workbench. It connects to AWS RDS without a problem.
Closing thoughts:
MySQL LOAD DATA Docs
AWS's Aurora RDS is MySQL-compatible, so the command works there too.
The "LOCAL" flag actually transfers the file from your client machine (where you're running the command) to the DB server. Without LOCAL, the file must be on the DB server, and it isn't possible to transfer it there in advance with RDS.
It works great on huge files too! I just sent an 8.2 GB file (260 million rows) via this method; it took just over 10 hours from a t2.medium EC2 instance to a db.t2.small Aurora instance.
It's not a solution if you need to watch out for unique keys or read the CSV row by row and change the data before inserting/updating.
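If the load has to run frequently, the same LOAD DATA LOCAL INFILE statement can be wrapped in a small script rather than typed into a session by hand. A minimal sketch, assuming mysql-connector-python, that local_infile is enabled in the RDS parameter group, and the same placeholder file and table names as above:

# Minimal sketch: run LOAD DATA LOCAL INFILE programmatically.
# Assumes mysql-connector-python; the client must opt in with allow_local_infile=True
# and the RDS parameter group must allow local_infile. Names are placeholders.
import mysql.connector

conn = mysql.connector.connect(
    host="mydb.xxxxxx.us-east-1.rds.amazonaws.com",  # placeholder RDS endpoint
    database="mydb",
    user="myuser",
    password="mypassword",
    allow_local_infile=True,
)

cur = conn.cursor()
cur.execute(
    """
    LOAD DATA LOCAL INFILE 'C:/upload.csv'
    INTO TABLE myTable
    FIELDS TERMINATED BY ',' ENCLOSED BY '"'
    IGNORE 1 LINES
    """
)
conn.commit()
conn.close()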
I did some digging and found this official AWS documentation on how to import data from any source into MySQL hosted on RDS.
It is a very detailed step-by-step guide and includes an explanation of how to import CSV files.
http://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/MySQL.Procedural.Importing.AnySource.html
Basically, each table must have its own file. Data for multiple tables cannot be combined in the same file. Give each file the same name as the table it corresponds to. The file extension can be anything you like. For example, if the table name is "sales", the file name could be "sales.csv" or "sales.txt", but not "sales_01.csv".
Whenever possible, order the data by the primary key of the table being loaded. This drastically improves load times and minimizes disk storage requirements.
There is another option for importing data into a MySQL database: you can use an external tool, Alooma, which can do the data import for you in real time.
It depends on how large your file is, but if it is under 1 GB, I found that DataGrip imports smaller files without any issues: https://www.jetbrains.com/datagrip/
You get a nice mapping tool and a graphical IDE to play around with. DataGrip is available as a free 30-day trial.
I have myself experienced RDS connection dropouts with bigger files (> 2 GB); I'm not sure whether the issue is on the DataGrip side or the AWS side.
I think your best bet would be to develop a script in your language of choice to connect to the database and import it.
If your database is internet-accessible, then you can run that script locally. If it is in a private subnet, then you can either run the script on an EC2 instance with access to the private subnet or on Lambda connected to your VPC. You should really only use Lambda if you expect the runtime to be less than 5 minutes or so.
Edit: Note that Lambda only supports a handful of languages. AWS Lambda supports code written in Node.js (JavaScript), Python, Java (Java 8 compatible), and C# (.NET Core).
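For the Lambda route, the handler can stay small if the CSV is dropped into S3 first and the rows are inserted in bulk. This is just a sketch, assuming Python, pymysql bundled with the deployment package, and placeholder bucket, table, and connection details; reading the whole file into memory keeps it short, so it only suits files that fit comfortably in the function's memory.

# Sketch of a Lambda handler: read a CSV from S3 and insert it into RDS MySQL.
# Assumes pymysql is packaged with the function; all names are placeholders.
import csv
import io
import boto3
import pymysql

s3 = boto3.client("s3")

def lambda_handler(event, context):
    obj = s3.get_object(Bucket="my-upload-bucket", Key="upload.csv")  # placeholder
    body = obj["Body"].read().decode("utf-8")
    reader = csv.reader(io.StringIO(body))
    next(reader)  # skip the header row

    conn = pymysql.connect(
        host="mydb.xxxxxx.us-east-1.rds.amazonaws.com",  # placeholder endpoint
        database="mydb", user="myuser", password="mypassword",
    )
    try:
        with conn.cursor() as cur:
            cur.executemany(
                "INSERT INTO myTable (col1, col2, col3) VALUES (%s, %s, %s)",
                [tuple(row) for row in reader],
            )
        conn.commit()
    finally:
        conn.close()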

How to import an MS SQL Server table into MySQL?

I need to export about 300,000 rows from a table on MS SQL Server and import them into MySQL on a different (non-Windows) server.
There is some text stored in some of the fields, and commas in the text will mess up the format if I export to a txt format.
I can't install any software on the server.
You may have several options:
use SSIS or the DTS Wizard on another host to interconnect MSSQL and MySQL
write your own small app to move the data (see the sketch after this answer)
script the MSSQL DB's data into a script file with SSMS's scripting features, adapt the script manually for MySQL, and run it
I'm pretty sure you can connect to MySQL from SSIS. I'd go down that route.
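If the commas in text fields are the main worry, a properly quoted CSV sidesteps the problem, since LOAD DATA can be told about the quoting with ENCLOSED BY. Here is a rough sketch of the "small app" option, run from a workstation so nothing has to be installed on either server; it assumes pyodbc, and the server, table, and file names are placeholders.

# Rough sketch: dump a SQL Server table to a fully quoted CSV from a workstation,
# so commas inside text fields cannot break the format. Names are placeholders.
import csv
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=mssql-host;DATABASE=mydb;UID=user;PWD=password"
)
cur = conn.cursor()
cur.execute("SELECT * FROM dbo.MyTable")  # placeholder table; ~300,000 rows is fine

with open("mytable.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f, quoting=csv.QUOTE_ALL)  # quote every field
    writer.writerow([col[0] for col in cur.description])  # header row
    while True:
        rows = cur.fetchmany(10000)
        if not rows:
            break
        writer.writerows(rows)

conn.close()

On the MySQL side the file can then be loaded with LOAD DATA INFILE ... FIELDS TERMINATED BY ',' ENCLOSED BY '"' IGNORE 1 LINES, as shown in the RDS answer above.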