Efficient way of doing Off-site Vault backups

Mario-Villada
Advocate
1,359 Views
11 Replies
Message 1 of 12

Hi guys, how do you do off-site Vault backups? My way takes about 30 hours! I run an rsync task from our FreeNAS box at the office to a Synology NAS at my place.

Since it takes so long, I can only do it once per week, and it clogs my home network. Is there a more efficient way of doing it?

 

We are on Vault Basic 2019.

 

All the best.

Accepted solutions (2)
Replies (11)
Message 2 of 12

swalton
Mentor

As I understand it, Rsync does not use compression or deduplication.  You might look for a tool that can do those steps.

This might help:

https://jstaf.github.io/2018/03/12/backups-with-borg-rsync.html

 

We used to use FalconStor running on our SAN to push VMware snapshots (including the Vault backup files) to a remote host twice a day.

 

We also use Vault Pro so we have incremental backups during the week and full backups on the weekend.

Steve Walton


Inventor 2025
Vault Professional 2025
Message 3 of 12

Mario-Villada
Advocate

Thanks mate! That link was very useful; I'm keeping it for reference.

However, my limited understanding of rsync is that it does compression via the -z parameter, and it is also supposed to be incremental, so only the changes in the data set are transferred. As a comparison, I have a 1 TB data set that gets backed up in about 30 minutes, because most of its files are unchanged; only recent changes and new files are synchronized.

 

It looks like each Vault backup is a completely new data set, so rsync has to delete the previous backup entirely and rewrite the whole backup again, which is about 200 GB.

So I was wondering if there is an option in the Vault backup so that only recent changes are saved (incremental).

 

0 Likes
Message 4 of 12

GunnarNilsson
Collaborator
Accepted solution

I suggest that, after the backup has run on the server, you rename the backup to a generic name and then do a robocopy /MIR to the NAS. That way you only copy changed files to the NAS.

MOVE D:\Vaultbackup\full\VaultBackup_20* D:\Vaultbackup\full\VaultBackup

ROBOCOPY "D:\Vaultbackup\full\VaultBackup" "\\NAS\Vaultbackup\full\Vaultbackup" /MIR

Message 5 of 12

Mario-Villada
Advocate

Thanks mate! I think you got it!

You made me realize that it is indeed a new data set each time, but this was my own doing: I have a script that always keeps two days of backups, one for yesterday and one for the day before. This is achieved by initially saving the backup to a temp folder; if there are no errors, the previous backup in folder A is renamed to B and the temp folder is renamed to A. It looks like I will have to find a way to synchronize the data before renaming.

 

Thank you again. See my backup script below.

@ECHO OFF

REM THIS WILL STOP THE WEB SERVER AND "CYCLE" THE SQL SERVER
IISRESET /STOP
NET STOP MSSQL$AUTODESKVAULT
NET START MSSQL$AUTODESKVAULT

REM START Backup on Temp folder
"C:\Program Files\Autodesk\ADMS 2019\ADMS Console\Connectivity.ADMSConsole.exe" -Obackup -B"F:\Vault Backups\Temp" -VUadministrator -VP -L"F:\Vault Backups\VaultBackupLog.txt" -S

REM Note: IF ERRORLEVEL n is true when the exit code is n OR HIGHER,
REM so test for failure first, then fall through to success.
IF ERRORLEVEL 1 GOTO End
GOTO OverwriteLastBackup

:OverwriteLastBackup
REM DELETE B AND CASCADE A BACKUP SUBDIRECTORIES
RMDIR /Q /S "F:\Vault Backups\B"
REN "F:\Vault Backups\A" "B"
REN "F:\Vault Backups\Temp" "A"

REM CREATE A NEW DIRECTORY FOR THE BACKUP
MKDIR "F:\Vault Backups\Temp"

REM START THE WEB SERVER
IISRESET /START

REM COPY BACKUPS FOLDERS TO NETWORK LOCATION IN *FREENAS* *Mario  Modification*
RMDIR /Q /S "\\VLfreenas\Vol8TB\vault_backups\B"
REN "\\VLfreenas\Vol8TB\vault_backups\A" "B"
ROBOCOPY "F:\Vault Backups\A" "\\VLfreenas\Vol8TB\vault_backups\A" /MIR /Z /W:10 /R:5 /MT:32 /V /NP /LOG:"F:\Vault Backups\robocopyLog.txt"


:End
REM Purge "F:\Vault Backups\Temp"
RMDIR /Q /S "F:\Vault Backups\Temp"
MKDIR "F:\Vault Backups\Temp"

REM START THE WEB SERVER
IISRESET /START
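For illustration, the Temp → A → B cascade in the script above can be sketched in POSIX shell; the directory names mirror the batch version, and everything under /tmp is a stand-in for "F:\Vault Backups":

```shell
#!/bin/sh
# Sketch of the cascading-backup rotation: the newest backup lands in
# Temp; on success, A (yesterday) becomes B (the day before) and Temp
# becomes the new A. Paths are stand-ins for the real backup drive.
set -e

ROOT=/tmp/vaultrotate
rm -rf "$ROOT"
mkdir -p "$ROOT/Temp" "$ROOT/A"
echo "yesterday" > "$ROOT/A/marker.txt"
echo "today"     > "$ROOT/Temp/marker.txt"

# Equivalent of: RMDIR /Q /S B, REN A B, REN Temp A
rm -rf "$ROOT/B"
mv "$ROOT/A" "$ROOT/B"
mv "$ROOT/Temp" "$ROOT/A"
mkdir -p "$ROOT/Temp"   # fresh Temp for the next run

cat "$ROOT/B/marker.txt"   # the old A content now lives in B
```

Note that every rename produces a directory whose contents rsync has never seen under that name, which is exactly why the off-site sync re-transfers everything.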

 

 

Message 6 of 12

Anonymous
Not applicable

I'm curious if you reached the solution you were aiming for. I'd like to do the same - do weekly offsite incremental backups without uploading the entire backup every time.

0 Likes
Message 7 of 12

Mario-Villada
Advocate

Not really; it takes more than 24 hours each time. I know Vault Professional has the option to do incremental backups, but we are on Basic at the moment.

0 Likes
Message 8 of 12

Anonymous
Not applicable
I was afraid of that. At least we have the script to get them to a networked drive, and I have another NAS that actively pulls backups. So even though it’s going to be slow, it’ll be automated.
0 Likes
Message 9 of 12

Anonymous
Not applicable
Accepted solution

I have a working solution! Instead of a new 300+ GB backup transfer each time, I now move only about 5 GB.

 

As you're aware, the original Vault backup process copies all files into a new directory each time. Because the directory itself is new every time, incremental backup tools see everything in it as new, hence hundreds of gigabytes per run.

 

After some internal testing, I've put together a script that creates a dedicated copy for incremental backup tools. Here's how it works:

  • Run the original Vault backup.
  • Perform the standard 'cascading backup'. **
  • ROBOCOPY the latest backup's contents (rather than the entire directory) into an existing directory. The outer directory never changes, only the files within it, and only the files that have actually changed get rewritten.
  • Back up the synchronized directory only.

** I don't have enough storage space for three versions (Temp, A, B), so I trimmed it down to 'Temp' and 'Latest' only.
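The key step above, copying the newest backup's contents into a directory whose name never changes, can be sketched in shell with nothing but a byte-compare: files that are identical are left alone (keeping their timestamps), so an incremental backup tool skips them. Directory names here are illustrative:

```shell
#!/bin/sh
# Sketch: copy files from the latest dated backup into a fixed
# 'SyncedBackup' directory, rewriting only files whose contents differ.
# All paths are illustrative stand-ins.
set -e

LATEST=/tmp/syncdemo/latest
SYNCED=/tmp/syncdemo/SyncedBackup
rm -rf /tmp/syncdemo
mkdir -p "$LATEST" "$SYNCED"

echo "unchanged" > "$LATEST/old.dat"
echo "unchanged" > "$SYNCED/old.dat"   # already present from a prior run
echo "new data"  > "$LATEST/new.dat"

for f in "$LATEST"/*; do
  dest="$SYNCED/$(basename "$f")"
  # cmp -s exits non-zero when the files differ or the target is missing
  if ! cmp -s "$f" "$dest"; then
    cp "$f" "$dest"
    echo "copied $(basename "$f")"
  fi
done
```

Unlike robocopy /MIR, this loop doesn't delete files that vanished from the source; it's only meant to show why a fixed target directory keeps the incremental delta small.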

 

Here are my two batch files:

FIRST: Run Vault Backup. Cascade. Optionally copy to network folder.

 

 

 

 

@ECHO OFF

REM THIS WILL STOP THE WEB SERVER AND "CYCLE" THE SQL SERVER

echo Stopping IIS Web Server...
IISRESET /STOP
echo ..done
echo.

echo Cycling (Restarting) SQL Server (Autodesk Vault)...
NET STOP MSSQL$AUTODESKVAULT
NET START MSSQL$AUTODESKVAULT
echo ..done
echo. 

REM START Backup on Temp folder

echo Backing up Autodesk Vault..
"C:\Program Files\Autodesk\ADMS 2017\ADMS 2018\ADMS Console\Connectivity.ADMSConsole.exe" -Obackup -B"S:\Vault Backups\Temp" -VUadministrator -VPmodem -L"S:\Vault Backups\VaultBackupLog.txt" -S

REM IF ERRORLEVEL 0 GOTO Success
REM IF ERRORLEVEL -1 GOTO Fail

:Success
echo ..success
echo.

REM START THE WEB SERVER

echo Restarting IIS Web Server...
IISRESET /START
echo ..done
echo.

REM Delete the prior backup and rename the latest

echo Replacing prior local backup with current backup

echo ..removing old local backup
RMDIR /Q /S "S:\Vault Backups\Latest"

echo ..renaming new local backup
REN "S:\Vault Backups\Temp" "Latest"

REM Replace the directory for the next local backup
MKDIR "S:\Vault Backups\Temp"
echo ..done
echo.


REM SKIP BACKUP to NETWORK. IT TAKES TOO LONG. We're using Synology Active Backup instead to pull from B:\
GOTO cleanup

REM Copy latest backup to a network location

echo Preparing to copy local backup to network location

echo ..renaming prior network backup
REN "\\someDevice\somePath\Vault Backups\Latest" "Prior"

echo ..copying latest local backup to network
ROBOCOPY "S:\Vault Backups\Latest" "\\someDevice\somePath\Vault Backups\Temp" /MIR /Z /W:10 /R:5 /MT:32 /V /NP /LOG:"S:\Vault Backups\robocopyLog.txt"

echo ..renaming new network copy
REN "\\someDevice\somePath\Vault Backups\Temp" "Latest"

echo .. removing prior network copy
RMDIR /Q /S "\\someDevice\somePath\Vault Backups\Prior"

Goto cleanup

:Fail
echo ..backup failed

REM START THE WEB SERVER

echo Restarting IIS Web Server...
IISRESET /START
echo ..done
echo.

:cleanup

echo Cleaning up Temp files..

REM Purge "S:\Vault Backups\Temp"
RMDIR /Q /S "S:\Vault Backups\Temp"

MKDIR "S:\Vault Backups\Temp"
echo ..done

 

 

 

 

As you can see... I've disabled the network backup, but my new online backup replaces it.

 

SECOND: Prepare another copy with only incremental changes

 

 

 

 

REM Change to the newest local backup directory (CD accepts a wildcard
REM and moves into the first matching VaultBackup_* folder)
cd /d S:
cd "S:\Vault Backups\Latest\VaultBackup_*"

REM Mirror only the contents into a fixed directory so unchanged files
REM are skipped and the outer directory name never changes
ROBOCOPY "databases" "B:\SyncedBackup\databases" /MIR
ROBOCOPY "filestores" "B:\SyncedBackup\filestores" /MIR
XCOPY /Y BackupContents.xml "B:\SyncedBackup"

 

 

 

 

Note, B: is a DEDICATED partition. There's nothing else on that drive except this synced folder. This lets my backup software select only that drive. 

(screenshots: the backup software configured to select only the dedicated drive)

 

You can combine this into a single batch file, or just modify your Task Scheduler task to have two actions. (That's what I did.)

Message 10 of 12

Anonymous
Not applicable

FYI: the majority of the data being transferred now is the database. At a minimum, every backup includes a new myVault_dbbak and KnowledgeVaultMaster_dbbak; these are about 4 GB on our server.
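Since those database dumps are rewritten in full every run, compressing them before the off-site hop can shrink that fixed cost considerably; database backups typically compress well. A sketch with gzip, where the file name is just a stand-in for a real _dbbak file:

```shell
#!/bin/sh
# Sketch: gzip a (stand-in) database backup before shipping it off-site.
# Real _dbbak files are binary SQL Server backups; repetitive page data
# usually compresses well with gzip or zstd.
set -e

rm -rf /tmp/dbdemo
mkdir -p /tmp/dbdemo

# Stand-in for myVault_dbbak: repetitive content, like database pages.
yes "vault database page" | head -n 10000 > /tmp/dbdemo/myVault_dbbak

# -k keeps the original file, -f overwrites any older .gz
gzip -kf /tmp/dbdemo/myVault_dbbak

ls -l /tmp/dbdemo
```

The trade-off is CPU time on the server and the fact that a compressed archive defeats rsync's per-file delta transfer, so this suits the always-fully-rewritten database files better than the filestore.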

Message 11 of 12

Anonymous
Not applicable

Also, I just realized that @GunnarNilsson already suggested a variation of this. I failed to read it closely enough to see that he had already addressed this with his MOVE command with a wildcard, followed by ROBOCOPY.

 

Oh well. At least you have a full script you (and others) can copy for future use.

0 Likes
Message 12 of 12

Mario-Villada
Advocate
This is fantastic! Thank you so much! What a good way to solve this issue. I will definitely implement this.