Complete solution to migrate a large database SQL file in MySQL


During a recent project I had to move a large MySQL database from a testing server to a live server, and I got stuck at almost every phase. First, exporting the database from the testing server caused problems (phpMyAdmin seems to have many issues handling large SQL files), and then importing on the live server threw up many more.

The purpose of this blog is to show how to get past all these hurdles and move a large database using different methods. This is not the only approach, just my personal choice; many experienced users rely on other solutions for this kind of work.

My situation was that I had the database username and password for the live server, but that user did not have superuser privileges. So here we go.

Export Database

My personal experience using phpMyAdmin to export a large database is not good, so I suggest using one of the PHP scripts available on the internet for exporting large SQL files, e.g. phpMiniAdmin, or writing your own. Simply upload the script to your testing server via FTP, browse to its URL, enter the database credentials and export.

Other options are the command line or an SSH client for Windows, as explained below.

Let us assume the username is abc and the password is abc123, the testing server from which you have to export the database is example.com and the database name is test_db1. The command to export it from your local Windows machine is:

mysqldump -u abc -p -h example.com test_db1 > c:\dbsqlfile.sql

Enter the password (in this case abc123); of course you can use any destination path instead of c:\dbsqlfile.sql on your local Windows machine.

If you are able to SSH into your testing server (example.com), run the following command from an SSH client for Windows such as PuTTY or WinSCP:

mysqldump -u abc -p test_db1 > httpdocs/dbsqlfile.sql

Enter the password, and remember that this command saves the SQL file on the testing server (example.com) at the path given in the command. Be careful when choosing the destination path, because the folder may have permission issues. Now you can download this SQL file to your own system over SFTP.
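Because these dumps can be very large, it is worth compressing the file on the testing server before downloading it. A small sketch, using the hypothetical path from above:

```shell
# Stand-in for the dump written by mysqldump above, so this sketch is
# self-contained; on the real server httpdocs/dbsqlfile.sql already exists.
mkdir -p httpdocs
echo "-- dump contents --" > httpdocs/dbsqlfile.sql

# Compress before downloading; -k keeps the original file,
# -f overwrites any previous .gz
gzip -kf httpdocs/dbsqlfile.sql
```

You then download dbsqlfile.sql.gz instead of the raw .sql file and decompress it locally with gunzip.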

Import Database

If you use the phpMyAdmin installed on the live server to import a large SQL file, chances are you will get a 'maximum execution time' error even if your file is smaller than the maximum upload size phpMyAdmin allows (that was my problem). So here is what you should do (some of you may think that raising the maximum execution time setting will help, but in my case it didn't).

Suppose your live server is liveexample.com, the database name is test_db2, the username is xyz and the password is xyz123. Remember this user only has permission to access the database test_db2.

On your local Windows machine, open the command line tool and enter the following command:

mysql -u xyz -p -h liveexample.com test_db2 < c:\dbsqlfile.sql

Enter your password (xyz123), and make sure the source path is correct.
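To confirm the import actually finished, you can list the tables that arrived, again from the local command line (a sketch using the hypothetical credentials above; the -e option runs a single statement and exits):

```shell
mysql -u xyz -p -h liveexample.com -e "show tables;" test_db2
```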

If you have already uploaded your database SQL file to the live server, you can use an SSH client for Windows such as PuTTY, or WinSCP (a free SFTP, FTP and SCP client), and enter the following command:

mysql -u xyz -p test_db2 < httpdocs/dbsqlfile.sql

Enter your password (xyz123), and again make sure the path to the SQL file is correct (it is the path where you uploaded the file on the live server). You can also log in to the mysql client and use the source command for the same purpose.
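The source approach looks like this (a sketch with the hypothetical credentials and path used in this post; the second line is entered inside the mysql client):

```shell
mysql -u xyz -p test_db2
mysql> source httpdocs/dbsqlfile.sql;
```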

One important thing I would like to mention: the whole round trip of exporting the database from the testing server to your local computer and then uploading it to the live server can be done in more efficient, much faster ways. If you know the tar, scp or rsync Unix commands for transferring files over a network, you can copy your large SQL file directly from one server to the other without any problem. I myself have used tar to transfer large files (though not in this case). Even if you are not a very experienced user, you can Google the command and run it in any SSH client program such as PuTTY.
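For example, a direct server-to-server copy might look like this (a sketch; the hosts, users and paths are the hypothetical ones from this post, and it assumes you can SSH from the testing server to the live server):

```shell
# On the testing server: dump and compress in one step
mysqldump -u abc -p test_db1 | gzip > dbsqlfile.sql.gz

# Copy straight to the live server, no local round trip
scp dbsqlfile.sql.gz xyz@liveexample.com:httpdocs/

# On the live server: decompress and import in one step
gunzip < httpdocs/dbsqlfile.sql.gz | mysql -u xyz -p test_db2
```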

Coming back to the main topic: there is a chance that, after running the import commands above, you get the error "Got a packet bigger than 'max_allowed_packet' bytes". This was the major problem I faced during all these steps, and it is what encouraged me to write this blog.

To fix it you have to raise the max_allowed_packet value for the MySQL daemon on your local system as well as on the live server.

If your database user xyz has superuser privileges, then you can do the steps below without any error.

SSH into the live server, log in to the mysql console on the live server and run these commands:

set global net_buffer_length=1000000;

set global max_allowed_packet=1000000000;

Use a generously large value for the packet size (the values above are in bytes).
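If you can edit the server's configuration, the same settings can be made permanent in my.cnf so they survive a restart (a sketch; the exact file location varies, often /etc/my.cnf or /etc/mysql/my.cnf):

```ini
[mysqld]
max_allowed_packet=1G
net_buffer_length=1M
```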

To check if they’re set, use

select @@max_allowed_packet;

Now open another SSH terminal and run the following command:

mysql --max_allowed_packet=100M -u xyz -p test_db2 < httpdocs/dbsqlfile.sql

Enter the password (xyz123) and you are done.

But as I mentioned above, my database user had no superuser privilege, so the command above still failed with the error message "access denied; you need the SUPER privilege for this operation …". So what to do now? You can either ask the hosting company for the password of the root user (which has all the privileges), or ask for permission to change the MySQL root password (which will not be allowed on shared hosting; you would need a dedicated server) and then do the import as described above. Alternatively, you can ask the hosting company to grant all privileges to the database user xyz.

I will assume you have got permission to reset the root password, so here is how to do it (this technique can also be used to reset the root password of a MySQL server in any other situation).

Log in to your live server as the root user using an SSH client (remember, only the root user has the privileges to execute these commands; this is the operating-system root user of the live server, not the database root user).

Stop the MySQL server:

service mysqld stop or /etc/rc.d/init.d/mysql stop

Start the MySQL server without password checks:

mysqld_safe --skip-grant-tables &

Connect to the MySQL server:

mysql -u root

Now set the password for the root user:

mysql> use mysql;

The database called mysql has a table called user which contains all the user information, including the root user. Sometimes there is no root user; instead you may find a user named admin which has all the privileges, in which case you have to reset the password of admin. To list all the users in the user table, use the following command:

mysql> select User from user;

I myself found admin instead of root. Now use an update command to reset the root user's password:

mysql> update user set password=PASSWORD('NEW PASSWORD') where User='root';

where NEW PASSWORD is the new password you want to set for the root user (remember to replace root with admin in the command above if you found the user admin instead of root). Now, to reload the privilege tables, run:
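Note that on newer MySQL versions (5.7 and later) the user table no longer has a password column (it is called authentication_string), and the preferred way is an ALTER USER statement, for example (a sketch; when the server was started with --skip-grant-tables, you must run flush privileges first so that ALTER USER is allowed):

```shell
mysql> flush privileges;
mysql> alter user 'root'@'localhost' identified by 'NEW PASSWORD';
```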

mysql> flush privileges;

mysql> quit;

After quitting the mysql client, stop the MySQL server again:

service mysqld stop  or /etc/rc.d/init.d/mysql stop

Start the MySQL server again and try your new password:

service mysqld start or /etc/rc.d/init.d/mysql start

mysql -u root -p or mysql -u admin -p

Enter your new password and you will be logged in to mysql. Now you can set max_allowed_packet and net_buffer_length and import the large SQL file as described above without any error.

That is all you need to export and import a large database. I hope you enjoyed the blog.

I would like to hear your suggestions and comments.
