
Magento Forum

Magento backup and disk space usage problem
 
mtbmonkey
Jr. Member
 
Total Posts:  28
Joined:  2011-05-20
 

I tried to make a Magento backup. It started generating a massive file in var/backups and the admin crashed, so I deleted the file while it was still getting bigger.

My disk space usage is now over 10 GB; it was 2.5 GB before I started.

There was nothing in the var/backups directory, then after a while a big file popped up. I deleted it and my disk space went down, then it started growing again. It's over 12 GB now and I can't see why.

I have deleted the cache, and there's nothing sizeable in var/backups or any other var directory.

I think the problem is related to this: https://www.magentocommerce.com/bug-tracking/issue/?issue=12180

However, I was able to back up from Magento before without the files being that big.

Can anyone shed light on the large disk space usage, and how I can put it right?
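
A quick way to see where the space is actually going (just a sketch, assuming shell access; /var/www/my_site below is only a placeholder for the store root):

# list the largest directories two levels under the Magento root
du -h --max-depth=2 /var/www/my_site | sort -hr | head -20
# watch a suspect directory grow in near real time
watch -n 5 'du -sh /var/www/my_site/var'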

 
 
MagenX
Enthusiast
 
Total Posts:  791
Joined:  2008-05-26
Dublin
 

Never use Magento's backup to back itself up.
Create a simple bash script and back up the Magento files and database seamlessly…
Create cron jobs, something like this:

# database backup
mysqldump -u {database_user} -p{database_pass} --single-transaction --routines --triggers --all-databases | gzip > /home/backup/db/database_`date '+\%a-\%d-\%m-\%Y'`.sql.gz
# site backup
tar -cvpzf  /home/backup/site/my_site_`date '+\%a-\%d-\%m-\%Y'`.tar.gz  /var/www/my_site/

Then you go here and get yourself the Dropbox Uploader script: https://github.com/andreafabrizi/Dropbox-Uploader

[root@magen ~]# ./dropbox_uploader.sh

 This is the first time you run this script.

 Please open this URL from your Browser, and access using your account:

 -> https://www2.dropbox.com/developers/apps

 If you haven't already done, click "Create an App" and fill in the
 form with the following data:

  App name: MyUploader2834211417
  Description: What do you want...
  Access level: App folder or Full Dropbox

 Now, click on the "Create" button.

 When your new App is successfully created, please type the
 App Key, App Secret and the Access level:

 # App key: xxxxxxxx
 # App secret: xxxxxxxxx
 # Access level you have chosen, App folder or Full Dropbox [a/f]: a

 > App key is xxxxxxxxx, App secret is xxxxxxxxxxx and Access level is App Folder, it's ok [y/n] y

 Token request... OK

 Please visit this URL from your Browser, and allow Dropbox Uploader
 to access your DropBox account:

 --> https://www2.dropbox.com/1/oauth/authorize?oauth_token=xxxxxxxxxxxxxx

 Press enter when done...

 > Access Token request... OK

 Setup completed!

[root@magen ~]# ./dropbox_uploader.sh upload /home/backup/db/*

Uploading /home/backup/db/database_Sat-09-02-2013.sql.gz to /home/backup/db/database_Sat-09-02-2013.sql.gz
######################################################################## 100.0%

DONE
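
For pulling a backup back down later, the same script also has a download command (a sketch; the remote and local paths below are only examples matching the upload above):

# fetch a previously uploaded dump back onto the server
./dropbox_uploader.sh download /home/backup/db/database_Sat-09-02-2013.sql.gz /home/restore/database_Sat-09-02-2013.sql.gz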

Then you assemble this stuff into a complete automatic cron job:

# database backup
mysqldump -u {database_user} -p{database_pass} --single-transaction --routines --triggers --all-databases | gzip > /home/backup/db/database_`date '+\%a-\%d-\%m-\%Y'`.sql.gz
# site backup
tar -cvpzf  /home/backup/site/my_site_`date '+\%a-\%d-\%m-\%Y'`.tar.gz  /var/www/my_site/
# upload both archives to Dropbox
/root/dropbox_uploader.sh upload /home/backup/db/*
/root/dropbox_uploader.sh upload /home/backup/site/*
# remove the local copies once uploaded
find  /home/backup/db/database_* -type f -exec rm {} \;
find  /home/backup/site/my_site_*  -type f -exec rm {} \;
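
To actually schedule it, one option (a sketch, assuming the lines above are saved as /root/magento_backup.sh; the path and time are placeholders) is a root crontab entry:

# run the backup script every night at 02:30 (add via `crontab -e` as root)
# note: inside a script file the date format is written '+%a-%d-%m-%Y';
# the \% escaping above is only needed when the command sits directly on a crontab line
30 2 * * * /bin/bash /root/magento_backup.sh >> /var/log/magento_backup.log 2>&1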

Then go buy a 6-pack of beer and relax.

PS: test this code on a local server first, then go to production. Weekend, you know…

 
 
chiefair
Mentor
 
Total Posts:  1848
Joined:  2009-06-04
 

Avoid Magento’s backup routines like the plague.

You’re going through a PHP process attached to a GUI frontend.

It has an amazing amount of overhead. Never is so much effort in memory and clock cycles expended for so little return.

Go for command-line shell scripts. mysqldump is very fast, and tar creates a compressed full filesystem backup. Expect an image-intensive (3-15 images per product) website with 10,000 products to have backups in the 1.5-2.1 GB range.

MagenX has a pretty nice package there to execute from the SSH command line and transfer over to Dropbox.

Mine’s a little more complex with an external config file to tailor it to each server it’s put on and is to be extended to shoot things over to an S3 bucket.

The one improvement I can recommend is to have the tar command exclude the var/backups/*, var/cache/* and media/catalog/product/cache/* directories and subdirectories. Anything in backups is a waste, cache will cause grief if the site is being restored to another server, and the image cache is huge and will be recreated as pages on the site are accessed.

tar -cvpzf  /home/backup/site/my_site_`date '+\%a-\%d-\%m-\%Y'`.tar.gz \
    --exclude="/var/www/my_site/var/backups/*"  --exclude="/var/www/my_site/var/cache/*" \
    --exclude="/var/www/my_site/media/catalog/product/cache/*" /var/www/my_site/

Over the life of the website, a lot of garbage collects in media/catalog/product/cache and there's no point in moving it around. Not restoring it lets your website start fresh and build a cache from only the images in use. Another means of keeping the backup from being defiled by stale garbage is to get the ImageClean module; Magento has no garbage collection on removed images. Deleting images from products otherwise leaves them sitting in the media/catalog/product/* subfolders till the end of time.
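
If the resized-image cache itself turns out to be the space hog, it is also safe to clear it from the shell (a sketch; the path is a placeholder, and as noted above Magento recreates these thumbnails as pages are accessed):

# see how much the resized-image cache is holding
du -sh /var/www/my_site/media/catalog/product/cache
# remove it; Magento regenerates the thumbnails as product pages are requested
rm -rf /var/www/my_site/media/catalog/product/cache/*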

 
 
MagenX
Enthusiast
 
Total Posts:  791
Joined:  2008-05-26
Dublin
 

Oh yeah, if your files are too big you can swap Dropbox for S3: create a shell script and then run it as a cron job.

First you go here and install s3tools:

http://s3tools.org/repositories

Then run something like:

s3cmd --configure

And then the shell script:

echo "Compressing the backup."
tar -cvpzf  /home/backup/my_site_$(date +%a-%d-%m-%Y).tar.gz  /var/www/html
echo "Site backup compressed"

# upload file
echo "Uploading the new site backup..."
s3cmd put /home/backup/my_site_*.tar.gz  s3://backup/site/
echo "New site backup uploaded."

echo "Removing the cache files..."
# remove
find  /home/backup/my_site_* -type f -exec rm {} \;
echo 
"Files removed."
echo "All done."
 