
Intraday fast partial database backup


epertinez


Hi everybody. This is a thread that will surely stick around.

Like many others, I back up my customer's website daily.

Yet if someday something goes really wrong, I could end up with a couple hundred intraday invoices lost in metaspace.

I would like to cron a process that backs up just the data needed to bring the system back up to date in case of failure.

What tables would you minimally back up, say, every 5-30 minutes?

How would you do it? With mysqldump, or with a lighter process?

For example:

ps_cart

ps_orders

ps_addresses

Which other tables?

Maybe it is wise to save only records that have been touched in the last 7 days?

SELECT * FROM ps_orders WHERE date_upd > NOW() - INTERVAL 7 DAY;

Has anybody tried this? Any help will be appreciated.

 

 

 


Hi,

Tables to consider:

  • ps_cart: This table contains data related to the customer's shopping carts. Since it's crucial for tracking active shopping sessions, consider including it.
  • ps_orders: This table stores information about customer orders, including order status and details. Definitely back this up.
  • ps_addresses: These are associated with customer profiles and orders, so they are important.
  • ps_customer: If you want to ensure that customer information is up to date, you might want to include this table.
  • ps_order_detail: This table contains information about the products ordered by customers.
  • ps_order_history: This table logs order status changes, which can be important for order tracking and history.
  • ps_order_carrier: Contains information about shipping carriers used for orders.
  • ps_order_payment: Stores payment information for orders.

 

You can create a backup script that runs via a cron job every 5-30 minutes. Simple example using mysqldump:

#!/bin/bash
DB_USER="your_db_user"
DB_PASS="your_db_password"
DB_NAME="your_db_name"
BACKUP_DIR="/path/to/backup/folder"
TIMESTAMP=$(date +"%Y%m%d%H%M%S")
BACKUP_FILE="$BACKUP_DIR/prestashop_data_$TIMESTAMP.sql"

# Dump only the necessary tables
mysqldump -u"$DB_USER" -p"$DB_PASS" "$DB_NAME" \
  ps_cart ps_orders ps_addresses ps_customer ps_order_detail \
  ps_order_history ps_order_carrier ps_order_payment > "$BACKUP_FILE"

# Optionally, compress the backup file to save space
gzip "$BACKUP_FILE"

Make sure that the script is executable (chmod +x your_script.sh) and add it to your crontab (crontab -e) to schedule it every 5-30 minutes, as desired.
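For reference, a minimal crontab entry could look like this (the script path and log file are placeholders to adapt):

# Run the backup script every 15 minutes, appending output to a log file
*/15 * * * * /path/to/your_script.sh >> /var/log/prestashop_backup.log 2>&1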

Remember to replace placeholders like your_db_user, your_db_password, and your_db_name with your actual database credentials.

 

Including only recently modified data is an excellent idea, but implementing it would require more complex logic in your script to determine which records have been updated in the last 7 days. You would need to compare the current timestamp with the date_upd field for each record in the respective tables. If the data volume is significant, this approach could become resource-intensive.
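One way to sketch this is with mysqldump's --where option, which applies the same condition to every table being dumped, so it only works for tables that actually have a date_upd column (ps_orders, ps_cart, and ps_address do, as far as I remember; some of the others, like ps_order_detail, do not). A rough sketch, reusing the variables from the script above:

# Incremental dump: only rows touched in the last 7 days, data only (no CREATE TABLE).
# List only tables that have a date_upd column.
INCR_FILE="$BACKUP_DIR/prestashop_recent_$TIMESTAMP.sql"
mysqldump -u"$DB_USER" -p"$DB_PASS" "$DB_NAME" \
  --no-create-info \
  --where="date_upd > NOW() - INTERVAL 7 DAY" \
  ps_orders ps_cart ps_address > "$INCR_FILE"
gzip "$INCR_FILE"

When restoring such a partial dump on top of the nightly full backup, the recent rows may collide with existing ones; mysqldump's --replace option (which writes REPLACE instead of INSERT statements) is one way to handle that.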

Finally, regularly test your backup and restore procedures to ensure they work as expected in case of an actual failure.
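A restore test can be as simple as loading a compressed dump into a scratch database; a minimal sketch, assuming a gzipped file produced by the script above and a hypothetical scratch database named prestashop_restore_test:

# Pick the dump you want to test (hypothetical file name)
RESTORE_FILE="$BACKUP_DIR/prestashop_data_20230901120000.sql.gz"

# Create a scratch database and load the dump into it
mysql -u"$DB_USER" -p"$DB_PASS" -e "CREATE DATABASE IF NOT EXISTS prestashop_restore_test"
gunzip -c "$RESTORE_FILE" | mysql -u"$DB_USER" -p"$DB_PASS" prestashop_restore_test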

Let me know if it helps!

Thanks!


Hi,

I offer a module that I use on all the sites I have a maintenance contract with to back up the database on a very regular basis and make my interventions safer. It saves only the important data, leaving out all tables with no sensitive data, which greatly reduces the size of the resulting archive file.

The backup can be sent by email when it completes, if required.

And, of course, it can be run as a cron job.

 


  • 3 weeks later...
On 9/1/2023 at 9:04 AM, AddWeb Solution said:


Great! Works like a charm.

Only ps_addresses should be ps_address (in my case?), but the rest works great.

It takes 6-10 seconds to dump (plus a few more to zip), so I can run it every few minutes.
Uploading it to Dropbox is almost as easy. Once you have created an App key in Dropbox, you simply call curl, and the upload takes very little time.

 

HORA=$(date +"%H")  # hour of day (00-23), used to rotate 24 files
BACKUP_FILE="Name of the backup file.$HORA.sql"
DROPBOX_API="very very long key given by dropbox app"

# Upload the gzipped dump to Dropbox
curl -X POST https://content.dropboxapi.com/2/files/upload \
   --header "Authorization: Bearer $DROPBOX_API" \
   --header "Dropbox-API-Arg: {\"path\": \"/DropboxInternalDirYouWannaUse/$BACKUP_FILE.gz\"}" \
   --header "Content-Type: application/octet-stream" \
   --data-binary @"$BACKUP_FILE.gz"

Note that with this code you end up with 24 files that rotate every day, one for each hour. Easy redundancy. Not very efficient, but it works.
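For completeness, a hypothetical crontab entry for this setup (the script name is a placeholder); since the file name only contains the hour, each hourly file is overwritten a day later, which is what gives the 24 rotating copies:

# Dump and upload every 10 minutes; the $HORA-based file name keeps 24 hourly copies
*/10 * * * * /path/to/backup_and_upload.sh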

 

 



 

 

 

