Complete Disaster Recovery Playbook · Version 1.0 · April 13, 2026
The WholeTech Network: 120 websites on a DigitalOcean droplet, a 14 TB Synology NAS holding 250,000+ original photos, two Beelink mini-PCs running automations, Backblaze B2 for cloud storage, and Claude Code orchestrating everything.
GOOGLE / EXTERNAL SOURCES (Takeout, Limitless API, scraping)
        |
        v
HOT SPRINGS BEELINK (Windows 11)
- Chrome for browser-auth downloads
- Git Bash + Python 3 for scripts
- Claude Code for orchestration
- Persistent mapping to NAS as Z:\
        |
        v
SYNOLOGY DS1522+ NAS (14 TB)
- Master archive for everything
- SMB share as \\HS_DS1522plus\home
- Mapped to Windows as Z:
        |
        v
BACKBLAZE B2 (cloud)
- walhus-photos (public, web serving)
- walhus-private (encrypted backup)
        |
        v
DIGITALOCEAN DROPLET (NYC1)
- 120 websites live
- Cloud photo grading (Claude API)
- Password-protected private data viewers
- Nightly backups at 3 AM
| Device | Location | Role | Specs |
|---|---|---|---|
| Beelink (hsspabee) | Hot Springs, AR | Primary automation server | Windows 11 Pro, Intel x86_64, 16+ GB RAM |
| Beelink (Cedar Creek) | Cedar Creek, TX | Backup/secondary | Similar Beelink, Windows |
| Synology DS1522+ | Hot Springs, AR | Master NAS archive | 5-bay, 14 TB usable, DSM 7.x |
| Internet | Hot Springs, AR | Home connection | ~750 Mbps download |
netplwiz -> enable automatic sign-in
Settings -> System -> Power & battery -> Sleep: Never
DSM -> Control Panel -> Hardware & Power -> check "Restart automatically after a power failure"
| Service | Account | Notes |
|---|---|---|
| Google (personal) | walhus@gmail.com | 5 TB Google One plan. All content source. |
| Google (business) | wholetechtexas@gmail.com | AdSense publisher: pub-7759195213529699 |
| DigitalOcean | paul account | Droplet IP: 143.198.182.180 |
| Backblaze B2 | walhus account | 2 buckets: walhus-photos, walhus-private |
| Anthropic | walhus account | API key in ~/.anthropic_api_key |
| GoDaddy | paul account | Most domains including convcast.com |
| Limitless | paul account | API key: ~/.limitless_api_key |
| GitHub | paulwalhus | All private repos |
| Claude Code | Paul's account | Runs on Beelinks |
Key ID: 000f357396a64340000000002
Application Key: K000TqUALZteRGZA1hEqwfLqbnla1sI
Every domain needs these A records:
| Type | Name | Value | TTL |
|---|---|---|---|
| A | @ | 143.198.182.180 | 1 hour |
| A | www | 143.198.182.180 | 1 hour |
Critical: Delete any GoDaddy parking/forwarding records. Keep only the two A records. SSL handled by Let's Encrypt.
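Since all 120 domains must point at the same IP, a scripted check catches any domain still carrying stale records after a change. A minimal sketch, assuming a domains.txt file with one domain per line (that file is an assumption, not part of the setup above):

```python
# check_dns.py - confirm each domain's A record resolves to the droplet.
# Assumes a hypothetical domains.txt listing one domain per line.
import socket

DROPLET_IP = "143.198.182.180"

with open("domains.txt") as f:
    domains = [line.strip() for line in f if line.strip()]

for domain in domains:
    for host in (domain, f"www.{domain}"):
        try:
            ip = socket.gethostbyname(host)
            status = "OK" if ip == DROPLET_IP else f"WRONG -> {ip}"
        except socket.gaierror:
            status = "NO RECORD"
        print(f"{host}: {status}")
```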
Run Synology's setup wizard. Create the admin account. Hostname: HS_DS1522plus
Control Panel -> Shared Folder -> Create -> Name: home. Enable SMB.
Control Panel -> Hardware & Power -> check "Restart automatically after a power failure"
\\HS_DS1522plus\home\
    photos\                 (final flattened archive)
    google-photos\
        zips\               (takeout zips, deleted after extraction)
        extracted\          (unpacked during processing)
    google-takeout\
        zips\
        extracted\
    digitalocean\           (droplet backups)
        www\
        nginx\
        letsencrypt\
        backups\
    lifelog\                (Limitless pendant data)
        YYYY-MM-DD\         (weekly folders)
            audio\
            by-day\
            transcripts.json
            _summary.txt
        fetch_limitless.py
    beelink-archive\
net use Z: \\HS_DS1522plus\home /persistent:yes
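After mapping, a short script can confirm the share is reachable and the top-level layout above is intact; a sketch (folder names taken from the layout above):

```python
# verify_nas.py - sanity-check the Z: mapping and archive layout.
from pathlib import Path

ROOT = Path("Z:/")
EXPECTED = ["photos", "google-photos", "google-takeout",
            "digitalocean", "lifelog", "beelink-archive"]

if not ROOT.exists():
    raise SystemExit(r"Z: not mapped - run: net use Z: \\HS_DS1522plus\home /persistent:yes")

for name in EXPECTED:
    print(f"{name}: {'ok' if (ROOT / name).is_dir() else 'MISSING'}")
```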
Fresh Windows 11 install. Set the timezone; create user walhu.
From git-scm.com. Provides Git Bash at C:\Program Files\Git\bin\bash.exe
From the Microsoft Store (provides python3 on PATH)
Default browser for Takeout downloads
From claude.com/code. Sign in with Paul's account.
ssh-keygen -t ed25519
ssh-copy-id root@143.198.182.180
net use Z: \\HS_DS1522plus\home /persistent:yes
pip install anthropic b2sdk
echo "sk-ant-api03-..." > ~/.anthropic_api_key
server {
    server_name DOMAIN www.DOMAIN;
    root /var/www/DOMAIN;
    index index.html;
    location / { try_files $uri $uri/ =404; }
    listen 443 ssl;
    ssl_certificate /etc/letsencrypt/live/DOMAIN/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/DOMAIN/privkey.pem;
    include /etc/letsencrypt/options-ssl-nginx.conf;
    ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem;
}
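Bringing a new domain online means stamping its name into that template. A minimal sketch, assuming the template is saved at /root/nginx-site.template (an assumed location) and that the domain's Let's Encrypt files already exist, restored from backup or issued by the certbot line below:

```python
# new_site.py - render the server block above for a new domain.
# Usage: python3 new_site.py example.com
import sys
from pathlib import Path

domain = sys.argv[1]
conf = Path("/root/nginx-site.template").read_text().replace("DOMAIN", domain)

available = Path(f"/etc/nginx/sites-available/{domain}")
available.write_text(conf)

enabled = Path(f"/etc/nginx/sites-enabled/{domain}")
if not enabled.exists():
    enabled.symlink_to(available)    # Debian-style enable

print(f"Wrote {available} - now run: nginx -t && systemctl reload nginx")
```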
htpasswd -bc /etc/nginx/.htpasswd-private paul testpass1944
location /private/ {
    auth_basic "Private Area";
    auth_basic_user_file /etc/nginx/.htpasswd-private;
    try_files $uri $uri/ $uri/index.html =404;
}
certbot --nginx -d DOMAIN -d www.DOMAIN --non-interactive --agree-tos --email walhus@gmail.com
0 3 * * * cd /root/backups && tar -czf www-$(date +\%Y\%m\%d).tar.gz /var/www/ && ls -t www-*.tar.gz | tail -n +9 | xargs rm -f
0 2 * * * certbot renew --quiet
| Bucket | Visibility | Contents | Cost |
|---|---|---|---|
| walhus-photos | allPublic | Safe-rated photos for web | ~$8-15/mo |
| walhus-private | allPrivate | Lifelog, scripts, grades, backups | ~$2/mo |
https://f000.backblazeb2.com/file/walhus-photos/{path}
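Paths with spaces or non-ASCII characters need URL-encoding when links are generated; a sketch:

```python
# b2_url.py - build a public walhus-photos URL for a given file path.
from urllib.parse import quote

BASE = "https://f000.backblazeb2.com/file/walhus-photos"

def public_url(path: str) -> str:
    return f"{BASE}/{quote(path)}"   # quote() keeps "/" separators intact

print(public_url("2024/06/beach day 01.jpg"))
# https://f000.backblazeb2.com/file/walhus-photos/2024/06/beach%20day%2001.jpg
```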
from b2sdk.v2 import InMemoryAccountInfo, B2Api

# Credentials are the Key ID / Application Key recorded above.
KEY_ID = "000f357396a64340000000002"
APP_KEY = "K000TqUALZteRGZA1hEqwfLqbnla1sI"

info = InMemoryAccountInfo()
api = B2Api(info)
api.authorize_account("production", KEY_ID, APP_KEY)
bucket = api.create_bucket("walhus-photos", "allPublic")
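Uploads use the same authorized client; the b2sdk call that upload-to-b2.py would build on looks like this (file paths are illustrative):

```python
# Continuing from the authorized api above:
bucket = api.get_bucket_by_name("walhus-photos")
bucket.upload_local_file(
    local_file="Z:/photos/2024/06/beach.jpg",  # illustrative source path
    file_name="2024/06/beach.jpg",             # key inside the bucket
)
```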
Persistent notes in C:\Users\walhu\.claude\projects\C--Users-walhu-websites\memory\
- MEMORY.md - index of all memory files
- user_*.md - user preferences
- feedback_*.md - corrections/guidance
- project_*.md - in-progress work context
- reference_*.md - external system references

Connected at claude.ai/settings/connectors:
Managed at claude.ai/code/scheduled. Run in Anthropic cloud, independent of local hardware.
<meta name="google-adsense-account" content="ca-pub-7759195213529699">
<script async src="https://www.googletagmanager.com/gtag/js?id=G-MFQ0P2H8G8">
</script>
<script>gtag('config','G-MFQ0P2H8G8');</script>
<script async src="https://pagead2.googlesyndication.com/pagead/
js/adsbygoogle.js?client=ca-pub-7759195213529699"></script>
<a href="https://wholetech.com" style="...">⚡ WholeTech</a>
About, Resources, FAQ, News, Videos, Social, Sitemap
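During a rebuild it is easy for a restored site to come back without these tags, so an audit pass helps. A read-only sketch over the droplet's docroots (assumes each site's homepage is /var/www/DOMAIN/index.html):

```python
# audit_tags.py - list restored sites missing the AdSense/GA tags.
from pathlib import Path

PUB_ID = "ca-pub-7759195213529699"
GA_ID = "G-MFQ0P2H8G8"

for index in sorted(Path("/var/www").glob("*/index.html")):
    html = index.read_text(errors="ignore")
    missing = [tag for tag in (PUB_ID, GA_ID) if tag not in html]
    if missing:
        print(f"{index.parent.name}: missing {', '.join(missing)}")
```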
| Script | Location | Purpose |
|---|---|---|
| extract-photos.sh | websites\ | Unzips takeout zips |
| upload-to-b2.py | websites\ | Upload photos to B2 |
| grade-photos.py | websites\ | AI-grade via Claude Haiku |
| build-photo-manifest.sh | websites\ | Build gallery JSON |
| auto-pipeline.sh | websites\ | Watches zips, auto-processes |
| cloud-grade-photos.py | /root/ (droplet) | Cloud grading from B2 URLs |
1. takeout.google.com: Chrome downloads to Z:\google-photos\zips\
2. auto-pipeline.sh detects a completed .zip (detection sketched below)
3. Unzips to Z:\google-photos\extracted\
4. upload-to-b2.py pushes to the walhus-photos bucket
5. grade-photos.py grades each photo with Claude
6. JSON manifest is rebuilt with all grades
7. Gallery HTML is injected and deployed to austinspring.com/photos
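The detection step matters because Chrome writes multi-gigabyte zips slowly; treating a still-growing file as complete would corrupt the extract. A sketch of the idea (the real logic lives in auto-pipeline.sh; this is an outline, not that script):

```python
# Detection sketch: a zip counts as complete once its size stops changing.
import time
from pathlib import Path

ZIPS = Path("Z:/google-photos/zips")

def completed_zips():
    for z in ZIPS.glob("*.zip"):
        size = z.stat().st_size
        time.sleep(30)                    # wait, then re-measure
        if z.stat().st_size == size:      # stable size -> download finished
            yield z

for z in completed_zips():
    print(f"ready: {z.name}")             # real pipeline: unzip, upload, grade
```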
Go to takeout.google.com. Deselect all, then select the desired products. Choose 50 GB chunks (not the 2 GB default).
Export preparation takes hours to days; a scheduled agent monitors Gmail for the completion email.
Set Chrome's download location to the NAS folder. Download links expire after 7 days.
Pipeline handles automatically.
| Product | Script | Output |
|---|---|---|
| YouTube | parse-youtube.py | Database + embeds matched to sites |
| Gmail | parse-gmail.py | Searchable email archive |
| Voice | parse-voice.py | Audio player with voicemails |
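For Gmail, Takeout delivers a single .mbox that Python's stdlib can walk directly; a sketch of the indexing core (the .mbox path follows Takeout's usual layout and is an assumption here):

```python
# Indexing sketch for a parse-gmail.py-style pass, stdlib only.
import json
import mailbox

MBOX = "Z:/google-takeout/extracted/Takeout/Mail/All mail Including Spam and Trash.mbox"

records = [
    {"from": msg["From"], "to": msg["To"],
     "date": msg["Date"], "subject": msg["Subject"]}
    for msg in mailbox.mbox(MBOX)
]

with open("gmail-index.json", "w") as f:
    json.dump(records, f, indent=2, default=str)
print(f"indexed {len(records)} messages")
```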
| Task | Schedule | Script |
|---|---|---|
| Sync Droplet to NAS | Daily 4:00 AM | sync-droplet-to-nas.sh |
Create task:
$action = New-ScheduledTaskAction -Execute 'C:\Program Files\Git\bin\bash.exe' -Argument '-l -c /c/Users/walhu/websites/sync-droplet-to-nas.sh'
$trigger = New-ScheduledTaskTrigger -Daily -At 4:00AM
$settings = New-ScheduledTaskSettingsSet -StartWhenAvailable
Register-ScheduledTask -TaskName 'Sync Droplet to NAS' -Action $action -Trigger $trigger -Settings $settings -User $env:USERNAME -Force
0 3 * * * cd /root/backups && tar -czf www-$(date +\%Y\%m\%d).tar.gz /var/www/
0 2 * * * certbot renew --quiet
Script on NAS: Z:\lifelog\fetch_limitless.py
API key: ~/.limitless_api_key
Usage: python3 fetch_limitless.py [YYYY-MM-DD]
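In outline, the fetch amounts to an authenticated GET and a write to the weekly folder. The endpoint, header name, and parameters below are assumptions about the Limitless API, not confirmed details; the authoritative logic is fetch_limitless.py itself. Needs requests (pip install requests):

```python
# Outline only - endpoint, header name, and params are assumptions.
import sys
from pathlib import Path
import requests

api_key = (Path.home() / ".limitless_api_key").read_text().strip()
date = sys.argv[1] if len(sys.argv) > 1 else None   # optional YYYY-MM-DD

resp = requests.get(
    "https://api.limitless.ai/v1/lifelogs",         # assumed endpoint
    headers={"X-API-Key": api_key},
    params={"date": date} if date else None,
    timeout=60,
)
resp.raise_for_status()
Path(f"Z:/lifelog/{date or 'latest'}-lifelogs.json").write_text(resp.text)
```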
Trigger ID: trig_01Ap71eVHgoVVMv7VxvUpPbc
Every 4 hours. Checks Gmail for Takeout emails. Creates Calendar alert + Gmail draft on detection.
| Script | Runs Where | Purpose |
|---|---|---|
| sync-droplet-to-nas.sh | Beelink | Daily droplet backup to NAS |
| extract-photos.sh | Beelink | Unzip photo takeouts |
| upload-to-b2.py | Beelink | Push to walhus-photos bucket |
| grade-photos.py | Beelink | Local AI grading via Claude |
| cloud-grade-photos.py | Droplet | Cloud grading from B2 URLs |
| auto-pipeline.sh | Beelink | Watches zips, auto-processes |
| build-photo-manifest.sh | Beelink | Gallery JSON builder |
| backup-to-b2-private.py | Beelink | Back up lifelog/scripts to private B2 |
| deploy-gallery.sh | Beelink | Deploy gallery to austinspring.com |
| parse-youtube.py | Beelink/Droplet | Parse YouTube Takeout |
| parse-gmail.py | Beelink/Droplet | Parse Gmail .mbox |
| parse-voice.py | Beelink/Droplet | Parse Google Voice |
| fetch_limitless.py | NAS/Beelink | Weekly Limitless fetch |
Beelink + Synology NAS, home network
Windows 11 + Git Bash + Python + Chrome on Beelink
Home share, SMB enabled, auto-restart configured
net use Z: \\HS_DS1522plus\home /persistent:yes
Ubuntu 22.04, NYC1, ~$6/mo. Add SSH key.
Sign in, restore memory from backup
Create account, both buckets, application key
API key to ~/.anthropic_api_key on both systems
pip install anthropic b2sdk                              # Beelink
pip3 install --break-system-packages anthropic b2sdk     # Droplet
apt install -y nginx certbot python3-certbot-nginx
scp -r /z/digitalocean/www/www/* root@DROPLET:/var/www/
scp -r /z/digitalocean/nginx/* root@DROPLET:/etc/nginx/
scp -r /z/digitalocean/letsencrypt/* root@DROPLET:/etc/letsencrypt/
Each domain: A records pointing to new droplet IP
All products, 50 GB chunks
Pipeline handles automatically
Register daily sync (PowerShell, see Section 13)
Nightly backups + cert renewal
Recreate Takeout monitor via RemoteTrigger
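Before calling the droplet recovery done, a bulk liveness check across all domains confirms DNS, nginx, and certificates together. A sketch, reusing the assumed domains.txt from the DNS section (needs requests):

```python
# verify_recovery.py - confirm every restored site answers over HTTPS.
import requests

with open("domains.txt") as f:       # hypothetical one-domain-per-line list
    domains = [d.strip() for d in f if d.strip()]

down = []
for d in domains:
    try:
        r = requests.get(f"https://{d}", timeout=10)
        if r.status_code != 200:
            down.append((d, str(r.status_code)))
    except requests.RequestException as e:
        down.append((d, type(e).__name__))

print(f"{len(domains) - len(down)}/{len(domains)} sites up")
for d, why in down:
    print(f"  DOWN: {d} ({why})")
```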
~4 hours to get 120 websites back online after droplet loss
~1-2 days to rebuild photo archive from new Google Takeout
~1 week for complete environment restoration with all automations