Task Scheduler command to delete Datastore backups?

12-12-2023 06:38 AM
ahagopian_coj
Occasional Contributor

Hello!  

We have learned that our 10.9.1 version of Enterprise has a bug that prevents backups from ever being deleted. According to support, they are not going to fix it and said we should upgrade, since no one on 11.0 has reported the issue. We can't upgrade; we're stuck at 10.9.1 for now. I am trying to create a task on the server to delete backups more than 7 days old. The task ran successfully, but it deleted all the backups. Does anyone have a command that works?

forfiles /p "backup file path" /s /m db* /D -7 /C  "cmd /c del /q @path"
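
I suspect part of the problem is that /m db* can match folders as well as files, and del on a folder path empties that folder regardless of the dates of the files inside it. Swapping del for echo gives a dry run that only prints what the filter matches instead of deleting anything:

forfiles /p "backup file path" /s /m db* /D -7 /C "cmd /c echo @path"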

Any suggestions or tips would really help. We really need this to work. Our hard drive keeps filling up with the backups, and then the datastore goes into read-only mode, which is super bad for us.

Thanks!


2 Replies
MarceloMarques
Esri Regular Contributor

This works fine for me.
It deletes files older than 17 days in the backup folder.
forfiles -p "E:\agsbackup\backup\server" -s -m * -d -17 -c "cmd /c del @path"
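
If the backup location mixes subfolders in with the files, a safer variant guards with the forfiles @isdir variable so that directories are never passed to del (a sketch, assuming the same path):

forfiles -p "E:\agsbackup\backup\server" -s -m * -d -17 -c "cmd /c if @isdir==FALSE del @path"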

| Marcelo Marques | Principal Product Engineer | Esri |
| Cloud & Database Administrator | OCP - Oracle Certified Professional |
I have worked with Enterprise Geodatabases since 1997.
"I do not fear computers. I fear the lack of them." - Isaac Asimov
ArchitSrivastava
Occasional Contributor II

Hello @ahagopian_coj,

I think what @MarceloMarques suggested will work. However, for cases like this I usually use PowerShell (just a preference 😬), since it gives me more flexibility: the folder path becomes a variable, files that are in use can be skipped, and the number of days to keep files is easy to change. Kindly find a sample script below:

# Set the folder path
$folderPath = "E:\agsbackup\backup\server"

# Set the number of days to keep backups
$daysOld = 7

# Get the current date
$currentDate = Get-Date

# Calculate the cutoff date
$cutoffDate = $currentDate.AddDays(-$daysOld)

# Get files older than the specified number of days (-File excludes directories)
$oldFiles = Get-ChildItem -Path $folderPath -File | Where-Object { $_.LastWriteTime -lt $cutoffDate }

# Loop through each file and attempt to remove it, skipping any that are in use
foreach ($file in $oldFiles) {
    try {
        Remove-Item $file.FullName -ErrorAction Stop
        Write-Host "Deleted: $($file.FullName)"
    } catch {
        Write-Host "Error deleting file: $($file.FullName) - $($_.Exception.Message)"
    }
}

This script uses the Get-ChildItem cmdlet to retrieve files in the specified folder, filters them by last write time, and then attempts to delete them. The try/catch block handles errors, such as skipping files that are in use. Additionally, you can customize the $folderPath and $daysOld variables as needed when running the script.

You can also schedule it in Task Scheduler to run at whatever frequency is required.
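
For example, something like the following from an elevated command prompt (the task name, script path, and start time are placeholders to adjust):

schtasks /create /tn "DeleteOldDatastoreBackups" /tr "powershell.exe -NoProfile -ExecutionPolicy Bypass -File C:\scripts\delete-old-backups.ps1" /sc daily /st 02:00 /ru SYSTEM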

Note: I am assuming you have already checked out the updatebackupretaindays utility for ArcGIS Data Store and that the bug you mention is with it.
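
For reference, that utility is run from the Data Store tools directory on the machine hosting ArcGIS Data Store; something like the following, assuming a default installation path:

cd "C:\Program Files\ArcGIS\DataStore\datastore\tools"
updatebackupretaindays 7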

Hope it helps!