How to delete data from database after select query executed successfully - mysql

My question: once the data has been selected successfully, it should be deleted from the MySQL DB. When the SELECT query fails because of the large amount of data, the shell script cannot delete anything. The selected rows also reference images stored on the filesystem, so those should be moved to another location and packed into a tar file as well.
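
One way to guarantee the delete only happens after a successful export is to export first and delete second in the same script, letting the mysql client abort on the first error. A minimal sketch, assuming an illustrative table archive_src with image_path and created_at columns, an illustrative cutoff date, and an export path of your choosing:

-- Export the rows first; SELECT ... INTO OUTFILE fails if the file
-- already exists, so an earlier run's output won't be overwritten.
SELECT id, image_path, created_at
FROM archive_src
WHERE created_at < '2023-01-01'
INTO OUTFILE '/var/backups/archive_src.csv'
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n';

-- Only reached if the export above succeeded (the mysql client stops
-- at the first error when run in batch mode without --force).
DELETE FROM archive_src
WHERE created_at < '2023-01-01';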

Related

How to execute a MySQL delete query in Azure Data Factory?

I would like to delete MySQL DB records using ADF.
I have created a pipeline in ADF that copies data from a MySQL database to a Storage Account using a Copy activity. Once that completes, I would like to delete the copied records from the MySQL database.
I am not able to find any activity that allows deleting records from a MySQL database.
The Script activity doesn't allow a MySQL linked service; only SQL DB is allowed.
Please suggest how to complete this.
You can use a Lookup activity, which supports both SQL and MySQL sources, with a query after the Copy activity to delete the records.
After the Copy activity, chain the Lookup activity to it and give it your source dataset.
Choose Query and give a SELECT followed by the TRUNCATE statement that deletes the records in the table:
select 1; truncate table [dbo].[output];
The SELECT is there only to avoid the Lookup error that is raised when a query returns no data. Even if the activity does report that error, the TRUNCATE still runs and the table is emptied.
If you want to drop the whole table instead, use a DROP query:
drop table <tablename>;
(Screenshots omitted: the data copied to blob storage after the Copy activity, and the table after the truncate.)
Here I did it using an Azure SQL database; you can do the same with an Azure MySQL database, as Lookup supports both.
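
If you only want to remove the rows you actually copied rather than truncating the whole table, the usual pattern is a DELETE bounded by the same filter the Copy activity used. A sketch with illustrative names (an orders table and a created_at cutoff matching the copy's watermark):

-- Delete exactly the rows the Copy activity exported, assuming the copy
-- was filtered on the same watermark value.
DELETE FROM orders
WHERE created_at < '2023-06-01 00:00:00';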
You need to create a stored procedure in your database and add a Stored Procedure activity as the final step in your Azure Data Factory pipeline. If you'd like to truncate the whole table once the copy is finished, here's how you would create the stored procedure:
GO
CREATE PROCEDURE SP_Truncate
AS
BEGIN
    TRUNCATE TABLE mytable;
END
Once you've created this, add a Stored Procedure activity as the last step in your Azure Data Factory pipeline. It'll delete the copied data. Read a bit more about this in the documentation; you can also add parameters to your stored procedure, which you can supply using a Lookup activity. Let me know if you need more help.
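
Note that the procedure above is T-SQL. Since the source database in the question is MySQL, the equivalent procedure would look like this (a sketch; mytable is the illustrative name from above):

DELIMITER //
CREATE PROCEDURE sp_truncate_mytable()
BEGIN
    -- Empties the table; TRUNCATE is DDL in MySQL and commits implicitly.
    TRUNCATE TABLE mytable;
END //
DELIMITER ;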

MySql database connectivity

I've written code to insert data into a MySQL database, successfully compiled it on cmd, and my database along with its table is ready.
I want to know the further steps so I can tell whether my record has been inserted into MySQL or not.
If your code runs as expected and there is no error, then the values should be inserted.
Steps to check whether everything is working fine or not:
Go to your database and run this query:
SELECT * FROM tableName
Run the program and insert some data.
Again go to your database and run this query:
SELECT * FROM tableName
If the changes match the data you inserted, your program is working fine.
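
On a large table, comparing row counts before and after the insert is quicker than scanning the full output (tableName as above):

-- Run before and after the insert; the count should grow by exactly
-- the number of rows your program inserted.
SELECT COUNT(*) FROM tableName;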

Apache Drill CTAS - Skip invalid records

I am trying to load a large number of files into an Apache Drill database from CSV files via the CTAS command.
The query fails completely even if one record in a million has a wrong value for a timestamp.
Is there any way to skip invalid records and load the rest with the Apache Drill CTAS command?
Query Used:
create table RAW_LOADER as
select time_report, record_count, TEXT_COL
from dfs.`/path/to/csv/`;
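
The question is left unanswered here, but one common workaround is to validate the problem column in the SELECT instead of letting the conversion fail: keep the column as text and filter out rows that don't match the expected timestamp pattern. A sketch, assuming a 'yyyy-MM-dd HH:mm:ss' format and that your Drill version supports TO_TIMESTAMP and REGEXP_MATCHES:

-- Skip rows whose timestamp text doesn't parse instead of failing the CTAS.
create table RAW_LOADER as
select to_timestamp(time_report, 'yyyy-MM-dd HH:mm:ss') as time_report,
       cast(record_count as int) as record_count,
       TEXT_COL
from dfs.`/path/to/csv/`
where regexp_matches(time_report,
      '[0-9]{4}-[0-9]{2}-[0-9]{2} [0-9]{2}:[0-9]{2}:[0-9]{2}');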

Importing data from Excel to MySQL database using SQLyog

Is there a way to automate importing data from Excel/CSV into a MySQL database using SQLyog so that it updates automatically every time?
Use Database -> Import -> Import External Data.
What exactly should happen "automatically each time", though? It may be easier to use the MySQL LOAD DATA command on a cronjob.
For an "update" via CSV, check out:
http://www.softwareprojects.com/resources/programming/t-how-to-use-mysql-fast-load-data-for-updates-1753.html
Step 1: Use LOAD DATA to import the entire input file into a temporary table, and update the primary key ids for any existing records we already have.
Step 2: Use SELECT INTO OUTFILE to write all records we couldn't find an id for (new records) into a temporary text file.
Step 3: Use LOAD DATA to add all new records (from step 2) to the target table.
Step 4: Use a single UPDATE query to update all records that match an existing primary key. The four steps are sketched below.
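
A sketch of the four steps in SQL, with illustrative names throughout (a target table keyed on id with an email natural key, a tmp_import staging table with a nullable id column, and made-up file paths):

-- Step 1: load the CSV into a staging table, then resolve ids for
-- records that already exist in the target table.
LOAD DATA LOCAL INFILE '/path/to/input.csv'
INTO TABLE tmp_import
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(email, name);

UPDATE tmp_import t
JOIN target g ON g.email = t.email
SET t.id = g.id;

-- Step 2: write the records we couldn't find an id for to a temp file.
SELECT email, name FROM tmp_import
WHERE id IS NULL
INTO OUTFILE '/tmp/new_records.csv'
FIELDS TERMINATED BY ',';

-- Step 3: load the new records into the target table.
LOAD DATA INFILE '/tmp/new_records.csv'
INTO TABLE target
FIELDS TERMINATED BY ','
(email, name);

-- Step 4: one UPDATE for every record that matched an existing key.
UPDATE target g
JOIN tmp_import t ON t.id = g.id
SET g.name = t.name;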
SQLyog is a standalone application and can't automate the import process.
You can create a cronjob that runs a LOAD DATA INFILE command on your server.

Copying MySQL table data to another table

I want to copy the data from one MySQL table to another table. The source table contains 30 million records. The SQL connection gets lost when I try to copy the data using the SQL query
INSERT INTO table2 SELECT * FROM table1;
Is there any external tool available to do this job from the shell?
Thanks,
Sree
The mysql command line tool should be able to handle this just fine: run the INSERT ... SELECT from the mysql client on the server itself, so the statement isn't subject to your application's connection timeout.
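
If even a client-side run hits timeouts, copying in primary-key batches keeps each statement short. A sketch assuming table1 has an auto-increment integer primary key id and table2 has the same schema:

DELIMITER //
CREATE PROCEDURE copy_in_batches()
BEGIN
    DECLARE last_id BIGINT DEFAULT 0;
    DECLARE max_id BIGINT;
    SELECT COALESCE(MAX(id), 0) INTO max_id FROM table1;
    -- Copy 100k-id slices at a time so no single statement runs long
    -- enough to trip a connection or lock timeout.
    WHILE last_id < max_id DO
        INSERT INTO table2
        SELECT * FROM table1
        WHERE id > last_id AND id <= last_id + 100000;
        SET last_id = last_id + 100000;
    END WHILE;
END //
DELIMITER ;

CALL copy_in_batches();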