MySQL database process overloading server after cloning

I cloned my WordPress site and host both copies on shared hosting. But now the new site constantly overloads the shared hosting server due to its MySQL database process.
Here are some screenshots:
https://prnt.sc/26lotyb
https://prnt.sc/26loucm
This is what I did to clone my site:
Export the database from phpMyAdmin
Compress and download the files in public_html
Upload and extract the files in the new public_html, import the database, and connect them via wp-config.php
Run a SQL command to replace the domain name (sketched after this list)
Use the Better Search Replace plugin to replace everything else
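For step 4, here is a minimal sketch of the kind of SQL involved (the wp_ table prefix and both domains are placeholder assumptions; note that a plain REPLACE can corrupt serialized values, which is exactly what Better Search Replace in step 5 handles):

mysql -u db_user -p wordpress_db <<'SQL'
-- swap the site and home URLs
UPDATE wp_options
  SET option_value = REPLACE(option_value, 'https://old.example.com', 'https://new.example.com')
  WHERE option_name IN ('home', 'siteurl');
-- swap hard-coded URLs inside post content
UPDATE wp_posts
  SET post_content = REPLACE(post_content, 'https://old.example.com', 'https://new.example.com');
SQL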

Related

Restoring a WordPress site using .zip and .sql onto AWS

What is the best way to restore a website from a backup when all I have is a .zip file and a .sql file of the WordPress site being migrated onto AWS? So far I have tried installing WordPress on an EC2 instance, and I was going to restore from the dashboard using a WordPress plugin, but no plugin I found can upload a .sql database like I thought. Which doesn't make much sense now that I really think about it. I am also unsure whether I should create a Microsoft SQL Server DB instance in RDS or MySQL for the WordPress install. Please help, so confused...
You can use this method:
Extract the wp-content folder and upload it.
Log onto your EC2 instance and use the mysql command line to import the database.
Update the siteurl and home rows in wp_options. (Both steps are sketched after this list.)
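As for RDS: WordPress runs on MySQL (or MariaDB), not Microsoft SQL Server, so a MySQL instance is the right choice if you move the database there. A minimal sketch of the last two steps on the EC2 instance (the database name, user, and dump file name are placeholder assumptions):

mysql -u wp_user -p wordpress < site-backup.sql
mysql -u wp_user -p wordpress -e "UPDATE wp_options SET option_value = 'https://new.example.com' WHERE option_name IN ('siteurl', 'home');"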

How do I restore a WP site from unzipped files and a SQL dump?

I have a site I am supposed to migrate. The previous developer did not give me access to the WordPress backend or a WordPress backup file. I do, however, have FTP access to the site. I've downloaded all of the site files, including WordPress, the plugins, and the theme, and I have a SQL dump file of the database. How can I restore the site (at another location) with these elements?
Assuming you have the previous site's root intact, meaning you extracted ALL the files and not just wp-content (if you only have wp-content, go back and download everything from the previous root):
1) Upload the zipped file to the new root directory
2) Unzip the file
3) Create a new database and database user
4) Import the SQL dump into the new database (steps 3 and 4 are sketched after this list)
5) In the wp-config.php file, adjust the settings to point to the new database
6) In the database, change the primary site URL to the new domain (there may be multiple instances, so do a search/replace)
7) In the database, create a WP user so you can log in to the WordPress backend
8) Remove the original zip file from the root
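A sketch of steps 3 and 4 from the MySQL command line (the database name, user, password, and dump file name are all placeholder assumptions):

mysql -u root -p -e "CREATE DATABASE new_wp_db; CREATE USER 'wp_user'@'localhost' IDENTIFIED BY 'choose-a-password'; GRANT ALL PRIVILEGES ON new_wp_db.* TO 'wp_user'@'localhost'; FLUSH PRIVILEGES;"
mysql -u wp_user -p new_wp_db < dump.sql

These are also the credentials you point wp-config.php at in step 5.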
Alternatively, assuming you have access to the database, you can use an MD5 hash generator to change the admin user's password, then use a duplicator package plugin (such as Duplicator) to copy and migrate the site wherever you need to.
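For the password change, a sketch of the usual SQL (the table prefix, login, and password are placeholder assumptions; WordPress accepts an MD5 hash in user_pass and re-hashes it with its stronger scheme on the next login):

mysql -u db_user -p wordpress_db -e "UPDATE wp_users SET user_pass = MD5('temporary-password') WHERE user_login = 'admin';"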

How to put a MySQL database in a project folder

I'm using Node.js Express and MySQL, and I have XAMPP installed. The database currently lives in C:\xampp\mysql\data\users, but I want it in C:\...my-project\users so I can push it to GitHub and other people can get it with all the other files. How can I move it to another folder and keep working with it on localhost?
The best way is to mysqldump your database. You can then restore the resulting .sql file with the mysql client or execute it as an ordinary SQL script (note that mysqlimport is for tab-delimited data files, not .sql dumps). A plain-text SQL dump also lets you put the database under version control.
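A sketch of that round trip, assuming XAMPP's mysql and mysqldump binaries (from C:\xampp\mysql\bin) are on your PATH and the database is the users database from the question:

# dump the live database into the project folder
mysqldump -u root -p users > users.sql
# anyone cloning the project can then recreate it locally:
mysql -u root -p -e "CREATE DATABASE IF NOT EXISTS users;"
mysql -u root -p users < users.sql

Commit users.sql alongside the code; the raw files under C:\xampp\mysql\data are not portable between installations, but the dump is.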

How to import a 180 MB table without getting a max_questions error

I have to import a 180 MB table on a shared host (which means trouble, I know) and I'm getting the max_questions error. The limit is 75,000 and they are not willing to raise it for even an hour. The number of queries is about 2,000,000.
I have been using BigDump to upload the gz file, but the limit is killing me.
Is there a way to split the table and upload it bit by bit, or merge it somehow? I have been searching online, but the solutions are mostly for people who can change the database settings.
I am assuming that you have phpMyAdmin installed in your website directory; otherwise, download the latest phpMyAdmin version from the official website and unzip it in your website directory. If that's the case, perform the steps below.
Access your website directory via SFTP or FTP. If you cannot access it via SFTP or FTP, you can set up cron jobs to fetch the files instead.
Find the config.inc.php file located in the phpmyadmin directory. In my case it is located here:
\public_html\phpmyadmin3.2.0.1\config.inc.php
Find the line with $cfg['UploadDir'] on it and update it to:
$cfg['UploadDir'] = 'upload';
Create a directory called 'upload' within the phpmyadmin directory:
\public_html\phpmyadmin3.2.0.1\upload
Then place the large SQL file you are trying to import into the new upload directory. Now, when you go to the import page in the phpMyAdmin console, you will notice a drop-down that wasn't there before; it lists all of the SQL files in the upload directory you just created. You can select your file there and begin the import.
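For reference, the file placement over SFTP might look like this (the host, login, and the phpmyadmin3.2.0.1 path follow the example above; yours may differ):

sftp user@yourhost.example.com
mkdir public_html/phpmyadmin3.2.0.1/upload
put large-dump.sql public_html/phpmyadmin3.2.0.1/upload/
exit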
Reference : http://daipratt.co.uk/importing-large-files-into-mysql-with-phpmyadmin/comment-page-4/

Full backup of GoDaddy site via command-line script

Is there a simple way to do an automated backup of an entire website on a host like GoDaddy via the command line?
So far, I know I need to back up all the files in my home directory recursively. I could possibly automate SFTP to connect and issue a get -R * command for the full file dump, or just use SCP.
The other half of the puzzle is getting all of the tables, mostly WordPress tables. My guess is that there's a command I could issue that dumps the database contents to a flat file, which I could then also pull via SFTP. If such a command exists, my plan is to use a combination of Telnet and EXPECT scripts to log in to the GoDaddy site, issue some commands, then disconnect back to my local shell.
The end result should be that I have a folder with all of my server content in it, plus the flat file backup of the SQL database from the server. I know there are WordPress backup plugins, but they tend to provide a slew of ZIP files, when all I want is the raw data directly so I can put it in my private SVN server for backup and versioning.
So my question: how do I extract all of the databases on my GoDaddy server via the command-line to a file?
Thank you.
In the end, I found a working solution using two separate expect scripts.
Telnet into the server, delete the old backups, use mysqldump to extract all tables to a flat file via mysqldump -u db_owner -p --all-databases > output.sql, and create a tarball of everything. Log out.
Use SCP to pull the newly created tarball and extract it into a local SVN-controlled working copy.
Use the second expect script to log in to the server and delete the backup. Log out.
From there, I just manually svn add and svn commit as needed.
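For anyone repeating this today, here is a condensed sketch of the same flow using ssh/scp in place of telnet + expect (the host, database credentials, and paths are placeholders, and it assumes the host allows SSH):

HOST='user@your-godaddy-host.example.com'
# 1. dump all databases and tar up the web root on the server (-t keeps the password prompt interactive)
ssh -t "$HOST" 'mysqldump -u db_owner -p --all-databases > output.sql && tar czf backup.tar.gz output.sql public_html'
# 2. pull the tarball into the SVN working copy and extract it
scp "$HOST:backup.tar.gz" ./svn-working-copy/
tar xzf ./svn-working-copy/backup.tar.gz -C ./svn-working-copy/
# 3. clean up on the server
ssh "$HOST" 'rm -f backup.tar.gz output.sql'
# 4. version the result
cd ./svn-working-copy && svn add --force . && svn commit -m "site backup"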