Aiven provides fully automated backup management for PostgreSQL. All backups are encrypted with service-specific keys, and point-in-time recovery is supported, allowing the system to be restored to any point within the backup window. Aiven stores the backups in the closest available cloud storage. For more details about Aiven backups, check out this help article.

Currently the raw backups Aiven creates are not accessible to our customers. If you wish to keep an additional set of backups, you can do so easily using standard PostgreSQL tooling. The pg_dump command lets you create backups that can be restored directly elsewhere if need be. Typical parameters for the command include the following:

pg_dump '<service_url_from_aiven_web_ui>' -f <target_file/dir> -j <number_of_jobs> -F <backup_format> 

The pg_dump command can also be run against one of the standby nodes, if any exist; simply use the replica URI from the Aiven web console. The -j option is especially useful in that case, because putting extra load on a standby node is usually less of a concern.

For example, to create a backup in the directory format (which can be used directly with pg_restore), using two concurrent jobs and storing the results in a directory called backup, you would run a command like this:

pg_dump 'postgres://avnadmin:password@mypg-myproject.aivencloud.com:26882/defaultdb?sslmode=require' -f backup -j 2 -F directory 
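
If you take the dump from a standby node as described above, only the connection URI changes. As a sketch, assuming the replica URI copied from the Aiven web console (the hostname below is a placeholder for your actual replica host):

pg_dump 'postgres://avnadmin:password@mypg-myproject-replica.aivencloud.com:26882/defaultdb?sslmode=require' -f backup -j 2 -F directory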

You could then, for example, pack all the files into a single tar archive and upload it to S3:

export BACKUP_NAME="backup-$(date -I).tar"
tar -cf "$BACKUP_NAME" backup/
s3cmd put "$BACKUP_NAME" "s3://pg-backups/$BACKUP_NAME"
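
Should you later need to restore the dump, the directory format can be fed straight to pg_restore. A minimal sketch, assuming the extracted backup directory is available locally and using a placeholder target service URL; depending on the roles in the target database you may also want options such as --no-owner:

pg_restore -d 'postgres://avnadmin:password@newpg-myproject.aivencloud.com:26882/defaultdb?sslmode=require' -j 2 backup/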