Complete disaster recovery playbook for the WholeTech environment. Every account, every script, every step to rebuild 120 websites, the photo library, Google data archive, and all automations from scratch.
This is the WholeTech Network: 120 websites on a DigitalOcean droplet, a 14 TB Synology NAS holding 250,000+ original photos, two Beelink mini-PCs running automations, Backblaze B2 for cloud storage, and Claude Code orchestrating everything.
GOOGLE / EXTERNAL SOURCES
↓ (Takeout, Limitless API, scraping)
HOT SPRINGS BEELINK (Windows 11)
· Chrome for browser-auth downloads
· Git Bash + Python 3 for scripts
· Claude Code for orchestration
· Persistent mapping to NAS as Z:\
↓
SYNOLOGY DS1522+ NAS (14 TB)
· Master archive for everything
· SMB share as \\HS_DS1522plus\home
· Mapped to Windows as Z:
↓
BACKBLAZE B2 (cloud)
· walhus-photos (public, web serving)
· walhus-private (encrypted backup)
↓
DIGITALOCEAN DROPLET (ubuntu-s-1vcpu-1gb-nyc1)
· 120 websites live
· Cloud photo grading (Claude API)
· Password-protected private data viewers
· Nightly backups at 3 AM
| Device | Location | Role | Specs |
|---|---|---|---|
| Beelink (hsspabee) | Hot Springs, AR | Primary automation server | Windows 11 Pro, Intel x86_64, 16+ GB RAM, 500 GB local SSD |
| Beelink (Cedar Creek) | Cedar Creek, TX | Backup/secondary — Paul remotes here | Similar Beelink mini-PC, Windows |
| Synology DS1522+ | Hot Springs, AR | Master NAS archive | 5-bay NAS, 14 TB usable, SMB/AFP, DSM 7.x |
| Network | Hot Springs, AR | Internet connection | ~750 Mbps download, home LAN |
Must be set so the Beelink auto-restarts after power outages:
netplwiz → enable automatic sign-in so the machine returns to the desktop after a reboot.
Settings → System → Power & battery → Screen and sleep → "Never" for sleep (screen can turn off, system cannot).
DSM → Control Panel → Hardware & Power → check "Restart automatically after a power failure"
Add an APC BE600M1 (~$60) to the NAS via USB. DSM auto-shuts down cleanly before battery dies. Protects against data corruption during Hot Springs storms. Plug the Beelink into the same UPS.
All accounts required to rebuild the environment. Store these securely - never in a public repo.
| Service | Purpose | Account | Notes |
|---|---|---|---|
| Google (personal) | Gmail, Photos, Drive, YouTube, Calendar | walhus@gmail.com | 5 TB Google One plan. Has all content. |
| Google (business) | AdSense for 120 websites | wholetechtexas@gmail.com | Publisher: pub-7759195213529699 |
| DigitalOcean | Droplet hosting 120 websites | paul account | Droplet IP: 143.198.182.180 (ubuntu-s-1vcpu-1gb-nyc1-01) |
| Backblaze B2 | Cloud storage for photos + backups | walhus account | 2 buckets: walhus-photos (public), walhus-private |
| Anthropic | Claude API for photo grading | walhus account | API key in ~/.anthropic_api_key |
| GoDaddy | Domain registrar | paul account | Most of the 120 domains + convcast.com |
| Limitless | Pendant lifelog data | paul account | API key in ~/.limitless_api_key on NAS |
| GitHub | Code repositories | paulwalhus | All private repos. Needs re-auth for Claude Code scheduled agents. |
| Claude Code | AI development tool | Signed in as Paul | Runs on Beelinks, connected to Anthropic |
| Gmail MCP | Claude's Gmail access | walhus@gmail.com | Connected via claude.ai connectors |
| Google Calendar MCP | Claude's Calendar access | walhus@gmail.com | Connected via claude.ai connectors |
Private viewer (htpasswd): paul / testpass1944
NAS SMB: walhu / (NAS password)
Droplet SSH: root@143.198.182.180 (key-based, ~/.ssh/id_rsa)
Phone: 501.365.1001 (Hot Springs)
B2 Key ID: 000f357396a64340000000002
B2 Application Key: K000TqUALZteRGZA1hEqwfLqbnla1sI
B2 Key name: photos-upload
120+ domains registered primarily through GoDaddy. All point to the DigitalOcean droplet (143.198.182.180) via A records.
Every domain needs these A records in GoDaddy DNS:
| Type | Name | Value | TTL |
|---|---|---|---|
| A | @ | 143.198.182.180 | 1 hour |
| A | www | 143.198.182.180 | 1 hour |
Critical: Delete any GoDaddy parking/forwarding records. Keep only the two A records above. SSL is handled by Let's Encrypt on the droplet.
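After updating DNS, it is worth confirming that every domain (and its www variant) actually resolves to the droplet before assuming the cutover worked. A minimal sketch; `check_domain` is a hypothetical helper, not one of the playbook's scripts, and the resolver is injectable so it can be tested offline:

```python
# Hypothetical DNS sanity check: verify a domain and its www variant both
# resolve to the droplet IP from the playbook. resolve() defaults to a real
# lookup but can be swapped out for testing.
import socket

DROPLET_IP = "143.198.182.180"

def check_domain(domain, expected_ip=DROPLET_IP, resolve=socket.gethostbyname):
    """Return (host, ok) pairs for the bare domain and its www variant."""
    results = []
    for host in (domain, f"www.{domain}"):
        try:
            ok = resolve(host) == expected_ip
        except OSError:
            # NXDOMAIN / network failure counts as a failed check
            ok = False
        results.append((host, ok))
    return results
```

Run it over the full domain list after a DNS change; anything still showing False is usually a leftover GoDaddy parking record or stale TTL.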
The DS1522+ is the master archive for everything - photos, lifelogs, website backups, Google data.
Follow Synology's setup wizard. Create admin account. Configure network with static IP or hostname HS_DS1522plus.
Control Panel → Shared Folder → Create → Name: home
Enable SMB in Control Panel → File Services
Control Panel → Hardware & Power → check "Restart automatically after a power failure"
\\HS_DS1522plus\home\
├── photos\ (final flattened photo archive)
├── google-photos\
│ ├── zips\ (downloaded takeout zips, deleted after extraction)
│ ├── extracted\ (unpacked during processing)
│ └── old-attempt-apr11\ (historical)
├── google-takeout\
│ ├── zips\
│ └── extracted\
├── digitalocean\ (droplet backups)
│ ├── www\ (120 website files)
│ ├── nginx\
│ ├── letsencrypt\
│ ├── backups\ (8 daily tarballs)
│ └── crontab.txt
├── lifelog\ (Limitless pendant data, 57+ weeks)
│ ├── YYYY-MM-DD\ (one folder per week, Sunday-dated)
│ │ ├── audio\
│ │ ├── by-day\
│ │ ├── transcripts.json
│ │ └── _summary.txt
│ ├── exports\ (manual data exports)
│ └── fetch_limitless.py
└── beelink-archive\ (overflow from C: drive)

Note: digitalocean\ above doubles as the daily sync destination for the droplet backup job.
net use Z: \\HS_DS1522plus\home /persistent:yes
Or through File Explorer: Map Network Drive → Z: → \\HS_DS1522plus\home → check "Reconnect at sign-in"
Windows 11 Pro box running Chrome, Git Bash, Python, and Claude Code. Two instances (Hot Springs primary, Cedar Creek backup).
Fresh install. Configure timezone, keyboard, user walhu.
Download from git-scm.com. Install with default options. Provides Git Bash (C:\Program Files\Git\bin\bash.exe).
Install from Microsoft Store (python3). Used by all the automation scripts.
Default browser for Takeout downloads, testing, Claude Code interactions.
Install from claude.com/code. Sign in with Paul's account.
Generate SSH keypair: ssh-keygen -t ed25519
Add public key to DigitalOcean droplet: ssh-copy-id root@143.198.182.180
net use Z: \\HS_DS1522plus\home /persistent:yes
pip install anthropic b2sdk
echo "sk-ant-api03-..." > ~/.anthropic_api_key
# For grading photos via Claude
Ubuntu droplet hosting all 120 websites, cloud photo grading, and private data viewers.
pip3 install --break-system-packages anthropic b2sdk

Each website has a config in /etc/nginx/sites-available/DOMAIN symlinked to /etc/nginx/sites-enabled/.
Standard template:
server {
    server_name DOMAIN www.DOMAIN;
    root /var/www/DOMAIN;
    index index.html;
    location / { try_files $uri $uri/ =404; }
    listen 443 ssl;
    ssl_certificate /etc/letsencrypt/live/DOMAIN/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/DOMAIN/privkey.pem;
    include /etc/letsencrypt/options-ssl-nginx.conf;
    ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem;
}
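With 120 sites sharing one template, the per-domain configs can be regenerated in a single pass rather than by hand. A sketch under stated assumptions: `render_site` is a hypothetical helper, and the real droplet's provisioning method is not documented in this playbook.

```python
# Hypothetical config generator: substitute each domain into the standard
# nginx template above. Paths match the playbook's layout.
TEMPLATE = """server {{
    server_name {d} www.{d};
    root /var/www/{d};
    index index.html;
    location / {{ try_files $uri $uri/ =404; }}
    listen 443 ssl;
    ssl_certificate /etc/letsencrypt/live/{d}/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/{d}/privkey.pem;
    include /etc/letsencrypt/options-ssl-nginx.conf;
    ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem;
}}
"""

def render_site(domain: str) -> str:
    """Render one /etc/nginx/sites-available/DOMAIN config body."""
    return TEMPLATE.format(d=domain)
```

Looping this over the domain list and writing each result to sites-available (then symlinking into sites-enabled) rebuilds all 120 configs in seconds.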
# Create htpasswd
htpasswd -bc /etc/nginx/.htpasswd-private paul testpass1944
# Add to nginx config
location /private/ {
auth_basic "Private Area";
auth_basic_user_file /etc/nginx/.htpasswd-private;
try_files $uri $uri/ $uri/index.html =404;
}
certbot --nginx -d DOMAIN -d www.DOMAIN --non-interactive --agree-tos --email walhus@gmail.com
Crontab: 0 3 * * * cd /root/backups && tar -czf www-$(date +\%Y\%m\%d).tar.gz /var/www/ && ls -t www-*.tar.gz | tail -n +9 | xargs rm -f
Creates /root/backups/www-YYYYMMDD.tar.gz nightly, keeps last 8 days.
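The retention half of that cron line (keep the newest 8 tarballs, delete the rest) can be mirrored in Python for ad-hoc cleanup or testing. This is a sketch, not the script the droplet actually runs; `prune_backups` is a hypothetical name:

```python
# Hypothetical re-implementation of the cron job's retention step:
# keep the 8 newest www-*.tar.gz files, delete everything older.
from pathlib import Path

def prune_backups(backup_dir, keep=8):
    """Delete all but the `keep` newest backups; return deleted file names."""
    tarballs = sorted(Path(backup_dir).glob("www-*.tar.gz"),
                      key=lambda p: p.stat().st_mtime, reverse=True)
    removed = []
    for old in tarballs[keep:]:
        old.unlink()
        removed.append(old.name)
    return removed
```

Note the cron version sorts by name via `ls -t`; this version sorts by mtime, which matters only if a tarball is ever touched after creation.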
| Bucket | Visibility | Contents | Monthly Cost |
|---|---|---|---|
| walhus-photos | allPublic | Safe-rated photos for web serving | ~$8-15 |
| walhus-private | allPrivate | Encrypted backups: lifelog, scripts, grades | ~$2 |
from b2sdk.v2 import InMemoryAccountInfo, B2Api
info = InMemoryAccountInfo()
api = B2Api(info)
api.authorize_account("production", "KEY_ID", "APP_KEY")
bucket = api.create_bucket("walhus-photos", "allPublic")
https://f000.backblazeb2.com/file/walhus-photos/{file_path}
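Gallery pages need that friendly URL for every photo, and file names from Google Photos often contain spaces. A small helper can build the URL with proper percent-encoding; `public_url` is a hypothetical name following the pattern above:

```python
# Hypothetical helper: build the public B2 download URL for a file in the
# walhus-photos bucket, percent-encoding path segments but keeping slashes.
from urllib.parse import quote

def public_url(file_path, bucket="walhus-photos",
               base="https://f000.backblazeb2.com/file"):
    return f"{base}/{bucket}/{quote(file_path)}"
```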
Claude Code is the AI orchestration layer. Runs on both Beelinks. Connects to Anthropic's Claude API and MCP connectors (Gmail, Calendar).
npm install -g @anthropic-ai/claude-code

Persistent notes in C:\Users\walhu\.claude\projects\C--Users-walhu-websites\memory\:
MEMORY.md - index of all memory files
user_*.md - user preferences, role, etc.
feedback_*.md - corrections Paul has given
project_*.md - in-progress work, context
reference_*.md - external system references

Connected at claude.ai/settings/connectors: Gmail MCP and Google Calendar MCP.
Managed at claude.ai/code/scheduled. Run on Anthropic's cloud, independent of local hardware.
All 120 sites live at /var/www/DOMAIN/ on the droplet. Standard template includes AdSense, Analytics, subnav bar, WholeTech branding.
Every site's index.html includes:
<meta name="google-adsense-account" content="ca-pub-7759195213529699">
<script async src="https://www.googletagmanager.com/gtag/js?id=G-MFQ0P2H8G8"></script>
<script>window.dataLayer=window.dataLayer||[];function gtag(){dataLayer.push(arguments);}gtag('js',new Date());gtag('config','G-MFQ0P2H8G8');</script>
<script async src="https://pagead2.googlesyndication.com/pagead/js/adsbygoogle.js?client=ca-pub-7759195213529699"></script>
And the WholeTech lightning bolt "Return to network" badge:
<a href="https://wholetech.com" style="...">⚡ WholeTech</a>
Standard links: About, Resources, FAQ, News, Videos, Social, Sitemap
The automated system that extracts photo zips, uploads to B2, grades with AI, and deploys galleries.
| Script | Location | Purpose |
|---|---|---|
| extract-photos.sh | C:\Users\walhu\websites\ | Unzips completed takeout zips to extracted folder |
| upload-to-b2.py | C:\Users\walhu\websites\ | Uploads extracted photos to walhus-photos bucket |
| grade-photos.py | C:\Users\walhu\websites\ | AI-grades photos via Claude Haiku: score, alt, description, tags, sensitivity |
| build-photo-manifest.sh | C:\Users\walhu\websites\ | Scans extracted photos, builds manifest JSON |
| auto-pipeline.sh | C:\Users\walhu\websites\ | Watches for new zips, auto-extracts, uploads, rebuilds gallery |
| cloud-grade-photos.py | /root/ (on droplet) | Grades photos from B2 URLs - runs in the cloud, weather-proof |
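Both grading scripts produce a per-photo record with score, alt, description, tags, and sensitivity. A minimal sketch of the validation step, assuming the model is prompted to return one JSON object per photo; the field names mirror this playbook but the real scripts' schema may differ, and `parse_grade` is a hypothetical helper:

```python
# Sketch of validating one model response in the grading pipeline.
# Assumes the prompt asks Claude for a JSON object with these five fields.
import json

REQUIRED = {"score", "alt", "description", "tags", "sensitivity"}

def parse_grade(raw: str) -> dict:
    """Parse one grading response; raise ValueError on malformed grades."""
    grade = json.loads(raw)
    missing = REQUIRED - grade.keys()
    if missing:
        raise ValueError(f"grade missing fields: {sorted(missing)}")
    if not isinstance(grade["score"], (int, float)):
        raise ValueError("score must be numeric")
    return grade
```

Validating before writing to the manifest keeps one malformed model reply from corrupting photo-grades.json.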
takeout.google.com
↓ Chrome downloads to Z:\google-photos\zips\
zip file complete (.zip, not .crdownload)
↓ auto-pipeline.sh detects it
unzip to Z:\google-photos\extracted\Takeout\Google Photos\
↓
upload-to-b2.py pushes new files to walhus-photos bucket
↓
grade-photos.py (Beelink) OR cloud-grade-photos.py (DO) grades each photo
↓
JSON manifest rebuilt with all grades
↓
Gallery HTML injected with manifest, deployed to austinspring.com/photos
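The detection step at the top of that pipeline (a zip counts as complete only once Chrome's .crdownload marker is gone and the file has stopped growing) can be sketched as follows. This is an illustrative Python rendering, not the actual auto-pipeline.sh logic, and `completed_zips` is a hypothetical name:

```python
# Sketch of the "zip file complete" check: ignore zips that still have a
# .crdownload sibling, then require the size to be stable for `settle` seconds.
import time
from pathlib import Path

def completed_zips(zips_dir, settle=1.0):
    """Return zip paths that appear fully downloaded."""
    zips_dir = Path(zips_dir)
    candidates = [z for z in sorted(zips_dir.glob("*.zip"))
                  if not any(zips_dir.glob(z.name + "*.crdownload"))]
    sizes = {z: z.stat().st_size for z in candidates}
    time.sleep(settle)
    return [z for z in candidates if z.stat().st_size == sizes[z]]
```

The size-stability check is a belt-and-suspenders guard for 50 GB downloads, where extracting a half-written zip would silently drop photos.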
How to extract all Google account data - the one-time export that seeds the entire archive.
Go to takeout.google.com. Deselect all. Select desired products.
Critical: Choose file size 50 GB (not default 2 GB) so you get manageable chunks.
Google takes hours to days to prepare the export. Scheduled agent monitors Gmail for completion email.
Set Chrome download location to NAS folder before clicking download buttons. 7-day expiration on download links.
Run extract-photos.sh or manual unzip for other data. Pipeline handles it automatically.
| Product | Script | Output |
|---|---|---|
| YouTube | parse-youtube.py | Database + embed codes matched to websites |
| Gmail (.mbox) | parse-gmail.py | Searchable private email archive |
| Voice (MP3 + HTML) | parse-voice.py | Audio player page with voicemails |
| Task | Schedule | What it runs |
|---|---|---|
| Sync Droplet to NAS | Daily 4:00 AM | bash /c/Users/walhu/websites/sync-droplet-to-nas.sh |
Create with PowerShell:
$action = New-ScheduledTaskAction -Execute 'C:\Program Files\Git\bin\bash.exe' -Argument '-l -c /c/Users/walhu/websites/sync-droplet-to-nas.sh'
$trigger = New-ScheduledTaskTrigger -Daily -At 4:00AM
$settings = New-ScheduledTaskSettingsSet -StartWhenAvailable -DontStopIfGoingOnBatteries -AllowStartIfOnBatteries
Register-ScheduledTask -TaskName 'Sync Droplet to NAS' -Action $action -Trigger $trigger -Settings $settings -User $env:USERNAME -Force
0 3 * * * cd /root/backups && tar -czf www-$(date +\%Y\%m\%d).tar.gz /var/www/ && ls -t www-*.tar.gz | tail -n +9 | xargs rm -f
0 2 * * * certbot renew --quiet
Script on NAS: Z:\lifelog\fetch_limitless.py
API key: ~/.limitless_api_key
Usage: python3 fetch_limitless.py (current week) or with date argument
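The lifelog tree stores one folder per week, named for the Sunday that starts it. Computing that folder name for an arbitrary date is the one fiddly part (Python's weekday() puts Monday at 0); a sketch with a hypothetical helper name, not code from fetch_limitless.py:

```python
# Hypothetical helper matching the lifelog layout: weekly folders named
# after the Sunday that starts the week (YYYY-MM-DD).
from datetime import date, timedelta

def week_folder(d: date) -> str:
    """Return the Sunday-dated folder name for the week containing `d`."""
    # weekday(): Monday=0 ... Sunday=6, so Sunday itself needs offset 0.
    days_since_sunday = (d.weekday() + 1) % 7
    sunday = d - timedelta(days=days_since_sunday)
    return sunday.isoformat()
```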
Trigger: trig_01Ap71eVHgoVVMv7VxvUpPbc
Runs every 4 hours in Anthropic cloud. Checks Gmail for Takeout emails, creates Calendar alert + draft when found.
Manage: claude.ai/code/scheduled
Every automation script, where it lives, and what it does.
| Script | Location | Runs where | Purpose |
|---|---|---|---|
| sync-droplet-to-nas.sh | C:\Users\walhu\websites\ | Beelink (Task Scheduler) | Daily backup of droplet to NAS |
| extract-photos.sh | C:\Users\walhu\websites\ | Beelink | Unzips photo takeouts |
| upload-to-b2.py | C:\Users\walhu\websites\ | Beelink | Uploads photos to B2 |
| grade-photos.py | C:\Users\walhu\websites\ | Beelink | AI-grade local photos via Claude |
| cloud-grade-photos.py | /root/ (droplet) | Droplet | AI-grade photos from B2 URLs |
| auto-pipeline.sh | C:\Users\walhu\websites\ | Beelink | Watches for new zips, auto-processes |
| build-photo-manifest.sh | C:\Users\walhu\websites\ | Beelink | Builds gallery JSON from extracted photos |
| backup-to-b2-private.py | C:\Users\walhu\websites\ | Beelink | Backs up lifelog, scripts, configs to private B2 |
| deploy-gallery.sh | C:\Users\walhu\websites\ | Beelink | Deploys curated photo gallery to austinspring.com |
| parse-youtube.py | C:\Users\walhu\websites\ | Beelink / Droplet | Parse YouTube Takeout, generate embed codes |
| parse-gmail.py | C:\Users\walhu\websites\ | Beelink / Droplet | Parse Gmail .mbox into searchable database |
| parse-voice.py | C:\Users\walhu\websites\ | Beelink / Droplet | Parse Google Voice Takeout |
| fetch_limitless.py | Z:\lifelog\ (NAS) | NAS / Beelink | Weekly Limitless pendant data fetch |
Step-by-step to rebuild the entire environment from nothing. Assumes Paul has his existing Google, DigitalOcean, GoDaddy, and Backblaze accounts.
Beelink + Synology DS1522+ (or any NAS with SMB). Connect to home network.
Create the home share, enable SMB, set auto-restart after power failure.
net use Z: \\HS_DS1522plus\home /persistent:yes
Create the droplet: Ubuntu 22.04, NYC1, ~$6/mo tier. Add the SSH key from the Beelink.
Sign in, restore memory system from backup.
Create account, create walhus-photos (public) and walhus-private (private) buckets. Generate application key.
Console.anthropic.com → API Keys → Create. Store at ~/.anthropic_api_key on both Beelink and droplet.
Beelink: pip install anthropic b2sdk
Droplet: pip3 install --break-system-packages anthropic b2sdk
apt update && apt install -y nginx certbot python3-certbot-nginx

If the NAS survived: scp -r /z/digitalocean/www/www/* root@143.198.182.180:/var/www/
If B2: download latest backup tarball from walhus-private bucket.
scp -r /z/digitalocean/nginx/* root@143.198.182.180:/etc/nginx/
scp -r /z/digitalocean/letsencrypt/* root@143.198.182.180:/etc/letsencrypt/
# or re-issue:
certbot --nginx -d DOMAIN -d www.DOMAIN

For each domain: GoDaddy → DNS → A records pointing to 143.198.182.180
takeout.google.com, 50 GB chunks, all products. Wait for email.
Set Chrome download folder to Z:\google-photos\zips\ or Z:\google-takeout\zips\.
cd /c/Users/walhu/websites
bash auto-pipeline.sh   # watches and processes zips as they land

Register the daily droplet sync (see code in Section 13).
Nightly backups, cert renewal
Recreate Takeout monitor via RemoteTrigger API
Verify after restore:
· lifelog/, scripts/, photo-grades.json
· /z/digitalocean/www/www/*
· walhus-private has scripts, grades, lifelog transcripts
· walhus-photos has safe-rated public photos

~4 hours to get 120 websites back online after droplet loss.
~1-2 days to rebuild full photo archive from new Google Takeout.
~1 week for complete environment restoration with all automations.
~/.anthropic_api_key - Claude API access
SSH key (~/.ssh/id_rsa) - droplet access
~/.claude/projects/photo-grades.json - 15,000+ hours of AI grading work