Migrate a sample of a MySQL table to PostgreSQL using the pgloader tool

Mission: migrate data from MySQL to PostgreSQL using pgloader (pgloader was installed following this guide).
Context: pgloader is installed on the machine where the MySQL server lives, and I also have a Linux VM that runs Postgres.
The MySQL server is accessed through MySQL Workbench. Apparently, my MySQL user does not have privileges to create tables on that server.
So what I would like to do is execute the command below:
pgloader mysql://mysql_username:password@mysql_server_ip/source_database_name postgresql://postgresql_role_name:password@postgresql_server_ip/target_database_name
and load only a sample of the MySQL table into the Postgres table, equivalent to this SQL query:
select *
from source_database_name.table1
where year = 2021
limit 1000;
Is that feasible with pgloader, or do I have to migrate the whole MySQL table, which is probably around 100 GB?
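It may be feasible without copying the whole table: pgloader's MATERIALIZE VIEWS clause can migrate the result of a query by first creating a view on the MySQL side. This is only a sketch, assuming your MySQL user can at least create views (the question suggests it may not be able to), and the view name sample_2021 is made up:

```shell
# Write a pgloader command file; connection strings are the question's placeholders.
cat > sample.load <<'EOF'
LOAD DATABASE
     FROM mysql://mysql_username:password@mysql_server_ip/source_database_name
     INTO postgresql://postgresql_role_name:password@postgresql_server_ip/target_database_name
MATERIALIZE VIEWS sample_2021
     AS $$ select * from table1 where year = 2021 limit 1000 $$;
EOF
# Real run: pgloader sample.load
grep -c 'MATERIALIZE' sample.load
```

If creating views really is off the table privilege-wise, the CSV workaround below remains the fallback.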
Temporary approach
Load the data using a CSV file.
Basically, I executed the SQL query above to produce 1000 sample rows from the MySQL table and saved them as a CSV file.
Then, in the same directory as the CSV file, I created a csv_file.load file. The file includes the following commands:
LOAD CSV
FROM './csv_file_with_data.csv'
INTO postgresql://<my_user>:<my_pass>@<public_ip>:5432/<db_name>?table1_1000
WITH truncate,
skip header = 1,
fields optionally enclosed by '"',
fields escaped by double-quote,
fields terminated by ','
SET client_encoding to 'latin1',
standard_conforming_strings to 'on'
BEFORE LOAD DO
$$ drop table if exists public.table1_1000; $$,
$$ CREATE TABLE IF NOT EXISTS public.table1_1000(
a varchar(4) DEFAULT NULL,
b varchar(20) DEFAULT NULL,
c varchar(15) DEFAULT NULL,
d int DEFAULT NULL,
...
);
$$;
Then I executed the file with pgloader csv_file.load
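For completeness, the sample CSV itself can be produced from the shell instead of Workbench. A sketch, assuming the mysql client is available and that no field contains embedded tabs or commas (the tr conversion is naive):

```shell
# Real run (placeholders from the question; requires the mysql client):
# mysql -h mysql_server_ip -u mysql_username -p -B -e \
#   "SELECT * FROM source_database_name.table1 WHERE year = 2021 LIMIT 1000;" \
#   | tr '\t' ',' > csv_file_with_data.csv
# The -B flag makes the client emit tab-separated rows; tr turns them into bare CSV.
# Demonstrated here on canned input instead of a live server:
printf 'a\tb\tc\n2021\tfoo\tbar\n' | tr '\t' ',' > csv_file_with_data.csv
cat csv_file_with_data.csv
```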

Related

Convert .mysql extension to xlsx

My supervisor was given a backup file from our company's cloud MySQL database (administered by a 3rd party).
The file has a .mysql extension. I can view some of the data using Notepad++, so I know it contains valid data. In my research I discovered this is a deprecated extension. Due to some reporting requirements, I was asked to move this data into Excel. Of the five of us in the shop, I know enough about databases to be considered the "expert" (a scary thought).
The research I've done leads me to believe I would be required to do a LAMP install to convert the .mysql file via PDO, which I think I could then convert to Excel. That seems like overkill to me.
Is there a more direct route? Load a legacy version of MySQL and hope I can do some conversion in Workbench? The file is a little over 500 MB.
I typically develop industrial controls in Python or C#.
-- MySQL dump 10.13 Distrib 5.7.33, for Linux (x86_64)
--
-- Host: localhost Database: company_name
-- ------------------------------------------------------
-- Server version 5.7.33-0ubuntu0.18.04.1
DROP TABLE IF EXISTS `ACTIVEMQ_MSGS`;
/*!40101 SET @saved_cs_client = @@character_set_client */;
...
/*!40101 SET character_set_client = utf8 */;
CREATE TABLE `ACTIVEMQ_MSGS` (
`ID` bigint(20) NOT NULL,
`CONTAINER` varchar(250) DEFAULT NULL,
`MSGID_PROD` varchar(250) DEFAULT NULL,
`MSGID_SEQ` bigint(20) DEFAULT NULL,
`EXPIRATION` bigint(20) DEFAULT NULL,
`MSG` longblob,
`PRIORITY` bigint(20) DEFAULT NULL,
PRIMARY KEY (`ID`),
KEY `ACTIVEMQ_MSGS_MIDX` (`MSGID_PROD`,`MSGID_SEQ`),
KEY `ACTIVEMQ_MSGS_CIDX` (`CONTAINER`),
KEY `ACTIVEMQ_MSGS_EIDX` (`EXPIRATION`),
KEY `ACTIVEMQ_MSGS_PIDX` (`PRIORITY`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;
...
LOCK TABLES `rh_blobs` WRITE;
/*!40000 ALTER TABLE `rh_blobs` DISABLE KEYS */;
INSERT INTO `rh_blobs` VALUES (data....)
INSERT INTO `rh_blobs` VALUES (data....)
Thanks to all for the recommendations. Using the mysql command line helped me solve the problem.
Installed MySQL on my desktop (Windows).
Started mysql with the following, entering the root password at the prompt:
cd c:\program files\mysql\mysql server 5.7\bin
mysql -u root -p
Created a new database and restored the backup/archive file into it:
create database company_name_report;
use company_name_report;
source c:\users\user_name\Downloads\company_name.mysql
Following @O. Jones's advice, I downloaded HeidiSQL and was able to view the data.
It should be a simple task now to export to CSV for use with Excel.
Assuming your file contains the text of a bunch of SQL statements, here's what you need to do.
Stand up a MySQL server temporarily.
Run the SQL statements in your file, one after the other in order, against that server.
Issue a SQL statement to write out a comma-separated-value (.csv) file for each table that's populated by that file of yours.
Import those .csv files into Excel.
Get rid of the MySQL server.
The version of the MySQL server you use is almost certainly irrelevant.
MySQL Workbench or the HeidiSQL client program, among many others, give you the tools you need for steps 2-4.
In MySQL Workbench, right-click the table to export and choose Table Data Export Wizard. In the wizard, choose csv export. Choose comma for the field separator, CR for the line separator. Enclose strings in ", the double quote character. And for null and NULL, choose NO.
Several cloud providers offer quick and easy ways to stand up MySQL servers. For example, Digital Ocean offers MySQL servers for pennies an hour. And they charge by the hour. If this little project costs you as much as five US dollars, that will be a surprise.
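Steps 2-4 can also be scripted rather than clicked through the wizard. A sketch for one table, assuming SELECT ... INTO OUTFILE is permitted (the output path is on the machine running MySQL; the delimiters mirror the wizard settings above):

```shell
# Write the export statement to a file, then feed it to the client:
cat > export_table.sql <<'EOF'
SELECT * FROM company_name.ACTIVEMQ_MSGS
INTO OUTFILE '/tmp/ACTIVEMQ_MSGS.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r';
EOF
# Real run: mysql -u root -p < export_table.sql
grep -c 'INTO OUTFILE' export_table.sql
```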

While doing an ALTER on a table to add a column, we are getting a timeout error via Flyway

We are using Flyway to apply schema changes to a MariaDB database. While doing an ALTER on a table we are getting a timeout error via Flyway. The ALTER TABLE just adds a column. The script is executed via Flyway from a Spring Boot application.
The classification table has 900k records.
this is the script:
ALTER TABLE wallet.wallet_debit
ADD COLUMN tin_code varchar(45) ;
When I ran the ALTER TABLE to add the column via MySQL Workbench, it completed successfully in a few seconds.
The error only occurs when the script is executed via Flyway.
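One possibility, purely an assumption not confirmed by the question, is that the JDBC connection gives up before the long-running ALTER finishes, while Workbench simply waits. MariaDB Connector/J lets you raise its socket timeout on the JDBC URL; a hypothetical Spring Boot properties fragment (host, port, and value are placeholders):

```properties
# Assumption: the timeout comes from the driver, not from Flyway or the server.
# socketTimeout is in milliseconds; 600000 = 10 minutes.
spring.datasource.url=jdbc:mariadb://db_host:3306/wallet?socketTimeout=600000
```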

Importing Data into MySQL Database with SOURCE command

I have an existing MySQL database that I want to refresh with data from an MS Access database. I already have the SQL files created from the Access database with all of the insert statements. There are 3 SQL files, the largest of which is 8 MB.
The database is on an AWS server. In the past I've imported data using Sequel Pro from my Mac. This is very slow and subject to session failure.
Now I've figured out how to create the SQL files on my Windows VM and FTP them directly to the AWS server. My intent is to have a stored procedure truncate all tables and SOURCE the SQL files:
SOURCE /home/me/file1.sql ;
SOURCE /home/me/file2.sql ;
etc...
The stored procedure would also do any prep work on the tables and any post-import things needed like fixing foreign keys, etc.
The first problem is that this command doesn't work and causes a syntax error:
set autocommit=0 ; source /home/me/CBD.sql ; commit ;
"source" is squiggly underlined and it says "missing colon". This happens whether or not I use the auto commit stuff.
Any ideas how I could do this?
Thanks...
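For what it's worth, SOURCE is a built-in of the mysql command-line client, not server-side SQL, which is why it cannot appear inside a stored procedure. A sketch of driving the same refresh from the shell instead (user, database, and file names are the question's placeholders; the stand-in file here is hypothetical):

```shell
# Stand-in for the real FTP'd file:
printf 'INSERT INTO t VALUES (1);\n' > /tmp/file1.sql
# Wrap the files with the prep and commit statements into one stream:
{ echo 'SET autocommit=0;'; cat /tmp/file1.sql; echo 'COMMIT;'; } > /tmp/refresh.sql
# Real run: mysql -u me -p db_name < /tmp/refresh.sql
cat /tmp/refresh.sql
```

The prep and post-import fix-up work the stored procedure was meant to do could be prepended and appended to the stream the same way.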

Issue importing a database .sql file into WAMP server (MySQL version 5.5.24)

I took a backup of my database on a WAMP server (MySQL version 5.0.10).
The database uses PK-FK relationships between multiple tables.
Now, when I import this backup.sql file into phpMyAdmin on my new WAMP server (i.e. 5.0.10 => 5.5.24), it shows a dependency error: Unknown column 'min_investment_size' in 'field list' when first dumping data for the company table.
I know the company table has an FK relationship to the investment table; so before dumping the company table, how do I dump the predecessor table it relies on?
Many other tables and procedures have the same dependency issue. E.g. user needs company.id, so the company table must be dumped even before the user table. Some screenshots I want to share:
So how do I get past this situation?
Note: I have already tried to migrate with
RedGate MySQL Comparator
MySQL Workbench
but the error during migration/synchronization is the same as above.
1) The table structure in the db does not have the column.
2) The stored proc already exists and your script doesn't do a check before it to create it only if it does not exist:
IF NOT EXISTS(SELECT 1 FROM mysql.proc p WHERE db = 'db_name' AND name = 'stored_proc_name') THEN
{your stored proc block here}
END IF;
or DROP ... IF EXISTS on the sp name, then create it.
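The second option, drop-then-create, as a plain script. Worth noting: the mysql.proc check above only works through MySQL 5.7, since that table was removed in MySQL 8.0, which makes DROP ... IF EXISTS the more portable route. The procedure name and body here are placeholders:

```shell
cat > recreate_proc.sql <<'EOF'
DROP PROCEDURE IF EXISTS stored_proc_name;
DELIMITER $$
CREATE PROCEDURE stored_proc_name()
BEGIN
  -- {your stored proc block here}
  SELECT 1;
END $$
DELIMITER ;
EOF
# Real run: mysql -u root -p db_name < recreate_proc.sql
grep -c 'DROP PROCEDURE IF EXISTS' recreate_proc.sql
```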

Alternative to openquery delete for MySQL

I am running Microsoft SQL Server 2012 and am attempting to run an openquery delete command into a MySQL database:
delete openquery(MyLinkedServer, 'select * from table_to_delete_from');
This works, however it is utterly, painfully slow, to the point that it is unviable. The dataset involved is far too large, and the above requires that all of the MySQL data to be deleted be fetched over the VPN to the MSSQL server.
When running the equivalent command directly on the MySQL server, it is over 5x faster, which is viable.
How can I invoke a delete command from MSSQL over to the MySQL linked server without having to copy the datasets? Perhaps running a stored procedure of sorts on the MySQL side? Does that work with openquery?
Try this:
exec ('delete from table_to_delete_from') at MyLinkedServer
Another case is SSIS: whenever I need to communicate with MySQL and insist on having a truncate step on a MySQL table, I provide a procedure like the one below:
CREATE PROCEDURE [dbo].[Usp_DeleteDynamicTable] @sourceTable nvarchar(max)
AS
BEGIN
DECLARE @SQL nvarchar(max)
SET @SQL = 'truncate table ' + @sourceTable
EXEC (@SQL) AT MYSQL -- MYSQL is my linked server name
END
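A hypothetical call from the MSSQL side, with a made-up table name, which ends up running TRUNCATE TABLE on the linked MySQL server:

```sql
-- Runs: truncate table source_database_name.table1 on the MYSQL linked server
EXEC dbo.Usp_DeleteDynamicTable @sourceTable = N'source_database_name.table1';
```

Because the table name is concatenated into dynamic SQL, it should only ever come from trusted input.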