Hello @ahagopian_coj,
I think what @MarceloMarques suggested will work. However, for cases like this I usually use PowerShell (just a preference 😬) because it gives me more flexibility: the folder path can be a variable, files that are in use can be skipped, and the number of days to keep files is easy to control. Please find a sample script below:
# Set the folder path
$folderPath = "E:\agsbackup\backup\server"

# Set the retention period in days
$daysOld = 7

# Calculate the cutoff date
$cutoffDate = (Get-Date).AddDays(-$daysOld)

# Get files (not folders) older than the specified number of days
$oldFiles = Get-ChildItem -Path $folderPath -File | Where-Object { $_.LastWriteTime -lt $cutoffDate }

# Loop through each file and attempt to remove it, skipping any that are in use
foreach ($file in $oldFiles) {
    try {
        Remove-Item -Path $file.FullName -ErrorAction Stop
        Write-Host "Deleted: $($file.FullName)"
    } catch {
        Write-Host "Skipped: $($file.FullName) - $($_.Exception.Message)"
    }
}
This script uses the Get-ChildItem cmdlet to retrieve the files in the specified folder, filters them by last write time, and then attempts to delete each one. The try/catch block handles errors, so files that are locked because they are in use get skipped and the script keeps going. You can customize the $folderPath and $daysOld variables as needed, and you can add -WhatIf to the Remove-Item call first to preview what would be deleted without actually deleting anything.
You can also schedule it in Task Scheduler to run at whatever frequency you need.
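If you want to create that scheduled task from PowerShell as well, here is a minimal sketch using the built-in ScheduledTasks cmdlets. The script path, task name, and the daily 2 AM trigger are just illustrative assumptions; adjust them to your environment:

```powershell
# Run the cleanup script daily at 2 AM (path and task name are examples only)
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
           -Argument "-NoProfile -ExecutionPolicy Bypass -File C:\Scripts\Cleanup-OldBackups.ps1"
$trigger = New-ScheduledTaskTrigger -Daily -At 2am

Register-ScheduledTask -TaskName "Cleanup old AGS backups" `
    -Action $action -Trigger $trigger `
    -Description "Deletes backup files older than the configured number of days"
```

Run this once from an elevated PowerShell session; after that, Task Scheduler takes care of the rest.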
Note: I am assuming you have already checked out updatebackupretaindays for ArcGIS Data Store and that the BUG you mention relates to it.
Hope it helps!