Importing very large MySQL Databases

Status
Not open for further replies.

mcdonna

New Member
Messages
6
Reaction score
0
Points
0
I have a very large MySQL DB that I need to import. It's circa 350MB uncompressed, and it zips down to about 60MB.

Is there any way I can import this with a free account?

It's to be used in conjunction with phpBB; I know the built-in "restore DB" function will not work on a forum this large.

The phpMyAdmin method appears to have a file-size limit slightly too small to accept even the zipped version.

That leaves the "backups" method; is that likely to work?

Ideally I would FTP the backup files and SSH in to import the data, but I believe there's no SSH access without paying. I don't want to put any cash down, as I'm still at the feasibility stage, seeing whether my forum can be migrated or not.

Due to the size of the DB, splitting it up into smaller sections via Notepad would be problematic and prone to error.

My cPanel name is mcdonna.
 

bugfinder

Retired
Messages
2,260
Reaction score
0
Points
0
The biggest problem I see is the maximum execution time. You could split it by table, and if you have particularly large tables you could batch them up and use a self-refreshing page to go through the import bit by bit...
 

mcdonna

New Member
Messages
6
Reaction score
0
Points
0
The biggest problem I see is the maximum execution time. You could split it by table, and if you have particularly large tables you could batch them up and use a self-refreshing page to go through the import bit by bit...

I'm currently trying the "backups" method. It hasn't definitely failed, but it appears to be hung: the browser status has been stuck on "waiting for: http://.../dosqlupload.html" and the page is blank thus far.

How long would you leave it before considering it to be "dead"? Bearing in mind it was 56MB zipped.

I'll leave it overnight (I'm on my work PC and about to go home) and see what happens; maybe the server-side processing just takes a long time (I uploaded the zipped version).

Unfortunately, I doubt splitting the DB into individual tables will help, since the majority of the content is contained within two tables.

Sounds like:

"batch them up and use a self-refreshing page to go through the import bit by bit"

may be the answer, but I'm not sure what you mean by that :-(
 

bugfinder

Retired
Messages
2,260
Reaction score
0
Points
0
Well, the MySQL backups are plain text. By splitting the records into, say, 100-record batches using something like PHP, you could use a page that splits out a set of records, inserts them into your database, then calls itself to move on to the next set, repeating until it reaches the end.
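The batching idea above can be sketched out like this (a Python illustration of the concept only, not a drop-in tool; the real thing would be a PHP page on the server, and the function name and batch handling here are my own):

```python
# Sketch of bugfinder's approach: treat the dump as plain text, cut it into
# small batches of statements, and run one batch per request so no single
# request hits the server's execution-time limit.

def split_statements(dump_text, batch_size=100):
    """Group the dump's statements into batches of `batch_size`.

    Caveat: splitting on ";\n" is naive and can break on semicolons
    inside string literals; a real importer must parse more carefully.
    """
    statements = [s.strip() for s in dump_text.split(";\n") if s.strip()]
    return [statements[i:i + batch_size]
            for i in range(0, len(statements), batch_size)]
```

In the web version, each request would execute one batch and then emit a meta-refresh (or redirect) pointing back at itself with the next batch number, repeating until every batch has been inserted.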
 

mcdonna

New Member
Messages
6
Reaction score
0
Points
0
Well, the MySQL backups are plain text. By splitting the records into, say, 100-record batches using something like PHP, you could use a page that splits out a set of records, inserts them into your database, then calls itself to move on to the next set, repeating until it reaches the end.

That makes sense. Is there anything out there already written to serve this purpose? Something I could just drop into my "forum" folder and access via the web? I'm guessing it's not an uncommon problem.
 

mcdonna

New Member
Messages
6
Reaction score
0
Points
0
Not that I know of, but then, I haven't looked :p

Well thanks for making me think along those lines anyway :)

For future people who may stumble across this post, I did a search and found a neat little tool called BIGDUMP by Alexey Ozerov.

It splits the dump into chunks of 3,000 lines (the default; it's configurable) and executes them one after the other. It got the job done with no problems, and after a few teething issues my forum is up and running (so much for the previous admin saying it would be a nightmare to move servers :lol:).

Best of all, BigDump gives you continual progress updates, unlike any of the other DB uploaders I've seen.
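That chunk-by-chunk flow with progress reporting can be pictured roughly like this (a Python sketch of the general idea only; BigDump itself is a PHP script, and these names are illustrative, not its actual API):

```python
# Rough sketch of a chunked import loop with progress reporting. Each call
# handles one chunk of the dump and reports how far through the file it is;
# a real tool runs one chunk per HTTP request rather than in a local loop.

def import_chunk(lines, start, chunk_size=3000):
    """'Process' one chunk of dump lines; return (next_offset, percent_done)."""
    end = min(start + chunk_size, len(lines))
    # ...each complete SQL statement within lines[start:end] would be
    # executed against the database here...
    percent_done = 100.0 * end / len(lines)
    return end, percent_done

# Looping chunk by chunk is what yields the continual progress updates:
dump = ["-- fake dump line"] * 9000
offset = 0
while offset < len(dump):
    offset, done = import_chunk(dump, offset)
    print(f"imported up to line {offset} ({done:.0f}% complete)")
```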
 