Re: For all you php / sql knowledge folks
Posted by Leperous on
Fri Jan 21st 2005 at 1:37am
Leperous
Creator of SnarkPit!
member
3382 posts
1635 snarkmarks
Registered:
Aug 21st 2001
Occupation: Lazy student
Location: UK
3.8mb?! Hah, this site's database is around 80mb.
The solution is to chop up your SQL dump and feed it in bit by bit: cut and paste large parts of the file at a time, instead of trying the whole thing at once.
Or, of course, you could upload the SQL file and write a PHP script that runs it from localhost...
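The chunking idea can be sketched with standard tools. A minimal example, assuming GNU split is available and that the dump breaks cleanly at line boundaries (true of typical mysqldump output, where each statement ends at a newline); the backup.sql here is fabricated for demonstration:

```shell
#!/bin/sh
# Fake a large dump for demonstration -- in the real case this is
# the mysqldump output you are trying to import.
seq 1 2000 | sed -e 's/^/INSERT INTO t VALUES (/' -e 's/$/);/' > backup.sql

# Split into 500-line chunks with numeric suffixes (GNU split's -d flag),
# then give each chunk a .sql extension so import tools accept it.
split -l 500 -d backup.sql part_
for f in part_??; do mv "$f" "$f.sql"; done

# Result: part_00.sql through part_03.sql, each small enough to paste
# into a web-based import form one at a time.
```

Splitting on line counts rather than byte counts is what keeps statements intact; a byte-based split could cut an INSERT in half and break the import.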
Re: For all you php / sql knowledge folks
Posted by xconspirisist on
Fri Jan 21st 2005 at 2:36pm
307 posts
81 snarkmarks
Registered:
Feb 26th 2003
Occupation: Student
Location: UK
I've been trying to hack apart some of the phpBB code; it is capable of processing massive SQL backup files.
3.8 megs, in 1 month, with only 14 users, and only forums. :smile:
Re: For all you php / sql knowledge folks
Posted by fraggard on
Sat Jan 22nd 2005 at 4:42pm
1110 posts
220 snarkmarks
Registered:
Jul 8th 2002
Occupation: Student
Location: Bangalore, India
xcon, if you have the database mirrored on a local SQL server on your machine, I think the simplest thing to do would be to dump your data one table at a time. Sequential filenames would be best. If you're running Linux (and I assume you are, based on what you've posted before), a simple perl/awk script should be able to dump the tables into sequential files automatically.
Then replay the dumps, also one file at a time. A PHP script should be able to do it automatically from a list of files.
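The two steps above can be sketched in shell. The database name "snarkpit" and the table names are hypothetical, and the mysql/mysqldump calls are stubbed with plain shell functions so the sketch runs without a MySQL server; with a real server you would swap in the commented commands:

```shell
#!/bin/sh
# Step 1: dump one table at a time into sequentially numbered files.
# Step 2: replay the dumps one file at a time, in the same order.

dump_table() {                  # real case: mysqldump snarkpit "$1" > "$2"
    echo "-- dump of $1" > "$2"
}
load_file() {                   # real case: mysql snarkpit < "$1"
    cat "$1" >> replayed.log
}

# Dump each table; the table list is hard-coded here, but in the real
# case it would come from: mysql -N -B -e 'SHOW TABLES' snarkpit
i=0
for t in users posts topics; do
    dump_table "$t" "$(printf '%03d_%s.sql' "$i" "$t")"
    i=$((i + 1))
done

# Replay the dumps. Zero-padded prefixes make the lexical glob order
# identical to the dump order, so tables load back in sequence.
: > replayed.log
for f in [0-9][0-9][0-9]_*.sql; do
    load_file "$f"
done
```

Dumping per table keeps each file small enough to import through a size-limited tool, and the numbered prefixes preserve ordering, which matters once foreign keys or dependent tables are involved.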