Automation & Scripting

Automate FileFortress tasks with scripts and scheduled jobs

Advanced · 45 minutes

This guide walks through five steps: non-interactive mode, scripting, scheduling, error handling, and monitoring.

Step 1: Understanding Non-Interactive Mode

Essential for automated scripts

Use the --non-interactive flag to suppress prompts:

filefortress remotes scan --all --non-interactive

Important

This is essential for automated scripts that run without user interaction.
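Because a non-interactive run cannot stop and ask for input, your script should check the exit status itself. A minimal Bash sketch; the `run_logged` helper and log path are illustrative, not part of FileFortress:

```shell
#!/bin/bash
# Run a command, record the outcome in a log, and propagate its exit status.
LOG_FILE="${TMPDIR:-/tmp}/filefortress-run.log"

run_logged() {
  if "$@"; then
    echo "[$(date)] OK: $*" >> "$LOG_FILE"
    return 0
  else
    local status=$?
    echo "[$(date)] FAILED ($status): $*" >> "$LOG_FILE"
    return $status
  fi
}

# The real call would be:
# run_logged filefortress remotes scan --all --non-interactive
run_logged true   # placeholder command for illustration
```

Wrapping every FileFortress call this way means cron or Task Scheduler can surface failures instead of silently ignoring them.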

Step 2: Create Your First Script

PowerShell and Bash examples

PowerShell (Windows)

# scan-daily.ps1

$keyFile = "$env:USERPROFILE\filefortress.key"

filefortress --key-file $keyFile remotes scan --all --non-interactive

Bash (Linux/Mac)

#!/bin/bash

# scan-daily.sh

KEY_FILE="$HOME/filefortress.key"

filefortress --key-file "$KEY_FILE" remotes scan --all --non-interactive

Handling Secrets Securely

Prevent secrets from appearing in command history

The Problem

When you provide secrets like AWS keys or encryption passwords directly in commands, they get saved in your shell history, making them visible to anyone with access to your terminal.

Solution 1: Environment Variables (Recommended for Automation)

FileFortress automatically reads secrets from standard environment variables:

  • FILEFORTRESS_S3_SECRET_KEY - AWS S3 Secret Access Key
  • FILEFORTRESS_B2_APPLICATION_KEY - Backblaze B2 Application Key
  • FILEFORTRESS_ENCRYPTION_PASSWORD - Remote encryption password
PowerShell Example

# Set environment variable (persists for session)

$env:FILEFORTRESS_S3_SECRET_KEY = "your-secret-key"

# Now add S3 remote in non-interactive mode without exposing secret

filefortress remotes add s3 --non-interactive --access-key AKID

Bash Example

# Export environment variable

export FILEFORTRESS_S3_SECRET_KEY="your-secret-key"

# Now add S3 remote in non-interactive mode without exposing secret

filefortress remotes add s3 --non-interactive --access-key AKID

Solution 2: Secret Files (Recommended for Local Scripts)

Store secrets in files and reference them by path:

PowerShell - Create Secret File

# Create a directory for secrets

New-Item -ItemType Directory -Force -Path "$env:USERPROFILE\.secrets"

# Save secret to file (no trailing newline; -Encoding ascii avoids the UTF-8 BOM that Windows PowerShell 5.1 would prepend)

"your-secret-key" | Out-File -FilePath "$env:USERPROFILE\.secrets\s3.key" -NoNewline -Encoding ascii

# Use the secret file in non-interactive mode

filefortress remotes add s3 --non-interactive --access-key AKID --secret-key-file "$env:USERPROFILE\.secrets\s3.key"

Bash - Create Secret File

# Create a directory for secrets

mkdir -p ~/.secrets

# Save secret to file (no trailing newline)

echo -n "your-secret-key" > ~/.secrets/s3.key

# Restrict permissions (user read-only)

chmod 600 ~/.secrets/s3.key

# Use the secret file in non-interactive mode

filefortress remotes add s3 --non-interactive --access-key AKID --secret-key-file ~/.secrets/s3.key
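Note that `chmod` above tightens permissions only after the file already exists. Setting a restrictive `umask` first avoids even a brief window in which the file is readable by others. A sketch, with an illustrative directory path:

```shell
#!/bin/bash
# Create the secret file with owner-only permissions from the start.
SECRET_DIR="${TMPDIR:-/tmp}/.secrets"

(
  umask 077                              # new files: read/write for owner only
  mkdir -p "$SECRET_DIR"
  rm -f "$SECRET_DIR/s3.key"             # ensure fresh permissions
  printf '%s' "your-secret-key" > "$SECRET_DIR/s3.key"   # no trailing newline
)

ls -l "$SECRET_DIR/s3.key"
```

`printf '%s'` is a portable alternative to `echo -n` and likewise writes no trailing newline.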

Custom Environment Variables

You can also use your own environment variable names:

# PowerShell

$env:MY_S3_SECRET = "your-secret-key"

filefortress remotes add s3 --non-interactive --access-key AKID --secret-key-env MY_S3_SECRET

When to Use Each Method

  • Interactive Mode (default): Best for one-time setup - secrets are masked as you type
  • Environment Variables: Best for CI/CD pipelines and automated systems - use with --non-interactive
  • Secret Files: Best for local scripts and development - use with --non-interactive
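The two automation-friendly methods can also be combined into a single lookup order: environment variable first, then file. A hypothetical Bash helper (the `get_secret` function and file layout are illustrative):

```shell
#!/bin/bash
# Resolve a secret: prefer an environment variable, fall back to a file.
get_secret() {
  local env_name="$1" file_path="$2"
  if [ -n "${!env_name:-}" ]; then       # indirect expansion: value of $env_name
    printf '%s' "${!env_name}"
  elif [ -r "$file_path" ]; then
    cat "$file_path"
  else
    echo "secret not found: $env_name / $file_path" >&2
    return 1
  fi
}

# Usage:
# SECRET=$(get_secret FILEFORTRESS_S3_SECRET_KEY ~/.secrets/s3.key)
```

This lets the same script run unchanged in CI/CD (env var set) and on a developer machine (secret file present).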

Learn more in our Security Best Practices guide.

Step 3a: Schedule Task (Windows)

Using Task Scheduler

# Create scheduled task for daily scan at 2 AM

$action = New-ScheduledTaskAction -Execute "PowerShell.exe" `

  -Argument "-NoProfile -ExecutionPolicy Bypass -File C:\Scripts\scan-daily.ps1"

$trigger = New-ScheduledTaskTrigger -Daily -At 2am

Register-ScheduledTask -TaskName "FileFortress Daily Scan" `

  -Action $action -Trigger $trigger

Step 3b: Schedule Task (Linux/Mac)

Using cron

# Edit crontab

crontab -e


# Add daily scan at 2 AM

0 2 * * * /home/username/scripts/scan-daily.sh
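If a scan ever runs longer than the interval between cron invocations, two instances can overlap. A common guard on Linux is `flock` (from util-linux); this sketch skips the run if the previous one still holds the lock (lock path illustrative):

```shell
#!/bin/bash
# Skip this run if a previous instance still holds the lock.
LOCK_FILE="${TMPDIR:-/tmp}/filefortress-scan.lock"

(
  flock -n 9 || { echo "previous scan still running, skipping"; exit 1; }
  # filefortress remotes scan --all --non-interactive   # the real work goes here
  echo "scan would run here"
) 9>"$LOCK_FILE"
```

File descriptor 9 holds the lock for the duration of the subshell, so the lock is released automatically even if the script crashes.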

Step 4: Error Handling and Logging

Add logging to track script execution

# PowerShell with logging

$keyFile = "$env:USERPROFILE\filefortress.key"

$logFile = "$env:USERPROFILE\FileFortress\logs\scan.log"

New-Item -ItemType Directory -Force -Path (Split-Path $logFile) | Out-Null

try {

  "[$(Get-Date)] Starting scan" | Out-File $logFile -Append

  filefortress --key-file $keyFile remotes scan --all --non-interactive

  if ($LASTEXITCODE -ne 0) { throw "filefortress exited with code $LASTEXITCODE" }

  "[$(Get-Date)] Scan completed" | Out-File $logFile -Append

} catch {

  "[$(Get-Date)] Error: $_" | Out-File $logFile -Append

}
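A Bash counterpart to the PowerShell logging above, using `set -e` plus an `ERR` trap so failures are recorded before the script exits (log path and placeholder command are illustrative):

```shell
#!/bin/bash
# scan-daily.sh with logging and error trapping (sketch)
set -euo pipefail

LOG_FILE="${TMPDIR:-/tmp}/filefortress-scan.log"
mkdir -p "$(dirname "$LOG_FILE")"

log() { echo "[$(date)] $*" >> "$LOG_FILE"; }

# Record any failing command before set -e aborts the script.
trap 'log "Error on line $LINENO (exit $?)"' ERR

log "Starting scan"
# filefortress --key-file "$HOME/filefortress.key" remotes scan --all --non-interactive
true   # placeholder for the real command
log "Scan completed"
```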

Step 5: Monitoring and Alerts

Send notifications on success or failure

# PowerShell with email notification (run immediately after the scan command)

# Send-MailMessage needs -From and an SMTP server; the values below are placeholders.

if ($LASTEXITCODE -eq 0) {

  Send-MailMessage -From "[email protected]" -To "[email protected]" `

    -SmtpServer "smtp.example.com" -Subject "Scan Success" -Body "Scan completed"

} else {

  Send-MailMessage -From "[email protected]" -To "[email protected]" `

    -SmtpServer "smtp.example.com" -Subject "Scan Failed" -Body "Check logs"

}
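The same pattern in Bash: branch on the exit status and hand off to whatever notifier you use (mail, webhook, chat bot). The `notify` function here only logs, as a stand-in for real delivery:

```shell
#!/bin/bash
# Branch on exit status; notify() is a placeholder for mail/webhook delivery.
ALERT_LOG="${TMPDIR:-/tmp}/filefortress-alerts.log"

notify() {  # notify <subject> <body>
  echo "[$(date)] $1 - $2" >> "$ALERT_LOG"
  # e.g. replace the line above with: mail -s "$1" [email protected] <<< "$2"
}

# filefortress remotes scan --all --non-interactive
if true; then   # placeholder for the scan's exit status
  notify "Scan Success" "Scan completed"
else
  notify "Scan Failed" "Check logs"
fi
```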

Automated Duplicate Detection

Schedule duplicate reports and cleanup

PowerShell - Weekly Duplicate Report

# weekly-duplicate-report.ps1

$logFile = "$env:USERPROFILE\FileFortress\logs\duplicates-$(Get-Date -Format 'yyyy-MM-dd').log"

$exportFile = "$env:USERPROFILE\FileFortress\exports\duplicates-$(Get-Date -Format 'yyyy-MM-dd').txt"

New-Item -ItemType Directory -Force -Path (Split-Path $logFile), (Split-Path $exportFile) | Out-Null


try {

  "[$(Get-Date)] Starting duplicate detection" | Out-File $logFile -Append


  # Export hash-verified duplicates

  filefortress find duplicates --non-interactive `

    --export-format paths `

    --keep-strategy oldest `

    --hash-verified-only `

    --include-keep-file `

    --output-file $exportFile


  # Count duplicates found

  $duplicateCount = (Get-Content $exportFile | Where-Object { $_ -notmatch '^#' }).Count

  "[$(Get-Date)] Found $duplicateCount duplicate files" | Out-File $logFile -Append


  # Send email notification if duplicates found

  if ($duplicateCount -gt 0) {

    Send-MailMessage -From "[email protected]" -To "[email protected]" `

      -SmtpServer "smtp.example.com" `

      -Subject "FileFortress: $duplicateCount duplicates found" `

      -Body "Review the export file: $exportFile" `

      -Attachments $exportFile

  }

} catch {

  "[$(Get-Date)] Error: $_" | Out-File $logFile -Append

}

Bash - Monthly Duplicate Cleanup

#!/bin/bash

# monthly-duplicate-cleanup.sh


LOG_FILE="$HOME/filefortress/logs/duplicates-$(date +%Y-%m-%d).log"

EXPORT_FILE="$HOME/filefortress/exports/duplicates-$(date +%Y-%m-%d).json"

mkdir -p "$(dirname "$LOG_FILE")" "$(dirname "$EXPORT_FILE")"


echo "[$(date)] Starting duplicate detection" >> "$LOG_FILE"


# Export duplicates as JSON

filefortress find duplicates --non-interactive \

  --export-format json \

  --keep-strategy by-remote \

  --keep-remote "Primary Storage" \

  --hash-verified-only \

  --output-file "$EXPORT_FILE"


# Parse JSON and get summary

DUPLICATE_COUNT=$(jq '.summary.filesToDelete' "$EXPORT_FILE")

SPACE_SAVINGS=$(jq '.summary.spaceToReclaim' "$EXPORT_FILE")


echo "[$(date)] Found $DUPLICATE_COUNT duplicates, potential savings: $SPACE_SAVINGS bytes" >> "$LOG_FILE"
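If `jq` is not installed, the `paths` export format can be summarized with standard tools instead, mirroring the PowerShell count above. A sketch with illustrative sample data:

```shell
#!/bin/bash
# Count duplicate entries in a paths-format export, skipping '#' comment lines.
EXPORT_FILE="${TMPDIR:-/tmp}/duplicates-example.txt"

# Sample export contents for illustration:
cat > "$EXPORT_FILE" <<'EOF'
# FileFortress duplicate export
/data/photo1.jpg
/backup/photo1.jpg
EOF

DUPLICATE_COUNT=$(grep -cv '^#' "$EXPORT_FILE")
echo "Found $DUPLICATE_COUNT duplicate files"
```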

Best Practices for Duplicate Automation

  • Always use --hash-verified-only for safety
  • Generate reports first, review before deleting
  • Send notifications when duplicates are found
  • Keep export files for audit trail
  • Schedule during off-peak hours

Learn more in our Duplicate Management Guide.

Automation Best Practices

  • Always use --non-interactive flag
  • Use key files instead of passwords
  • Implement comprehensive logging
  • Add error handling and notifications
  • Test scripts manually before scheduling
  • Schedule during off-peak hours

Learn More

Explore advanced automation patterns in the guides linked above.