Claude Build

Complete disaster recovery playbook for the WholeTech environment. Every account, every script, every step to rebuild 120 websites, the photo library, Google data archive, and all automations from scratch.

⚠ This document contains credentials and infrastructure details. Keep it private.

Table of Contents

  1. Environment Overview
  2. Hardware
  3. Accounts and Credentials
  4. Domain Portfolio
  5. Synology NAS Setup
  6. Beelink Setup (Hot Springs + Cedar Creek)
  7. DigitalOcean Droplet
  8. Backblaze B2 Storage
  9. Claude Code Installation
  10. 120 Websites
  11. Photo Pipeline
  12. Google Takeout Workflow
  13. Automated Systems
  14. All Scripts Reference
  15. Full Rebuild Playbook
  16. Disaster Recovery

1. Environment Overview

This is the WholeTech Network: 120 websites on a DigitalOcean droplet, a 14 TB Synology NAS holding 250,000+ original photos, two Beelink mini-PCs running automations, Backblaze B2 for cloud storage, and Claude Code orchestrating everything.

Built by: Paul Walhus (@springnet)
Started: 1996 (WholeTech), 2026 (current architecture)
Primary locations: Hot Springs, AR (NAS + Beelink) + Cedar Creek, TX (Beelink)
Cloud locations: DigitalOcean NYC (websites), Backblaze (B2 storage)
Total infrastructure cost: ~$20-25/month

The 3-Tier Architecture

GOOGLE / EXTERNAL SOURCES
        ↓ (Takeout, Limitless API, scraping)

  HOT SPRINGS BEELINK (Windows 11)
  · Chrome for browser-auth downloads
  · Git Bash + Python 3 for scripts
  · Claude Code for orchestration
  · Persistent mapping to NAS as Z:\
        ↓

  SYNOLOGY DS1522+ NAS (14 TB)
  · Master archive for everything
  · SMB share as \\HS_DS1522plus\home
  · Mapped to Windows as Z:
        ↓

  BACKBLAZE B2 (cloud)
  · walhus-photos (public, web serving)
  · walhus-private (encrypted backup)
        ↓

  DIGITALOCEAN DROPLET (ubuntu-s-1vcpu-1gb-nyc1)
  · 120 websites live
  · Cloud photo grading (Claude API)
  · Password-protected private data viewers
  · Nightly backups at 3 AM

2. Hardware

| Device | Location | Role | Specs |
|---|---|---|---|
| Beelink (hsspabee) | Hot Springs, AR | Primary automation server | Windows 11 Pro, Intel x86_64, 16+ GB RAM, 500 GB local SSD |
| Beelink (Cedar Creek) | Cedar Creek, TX | Backup/secondary (Paul remotes here) | Similar Beelink mini-PC, Windows |
| Synology DS1522+ | Hot Springs, AR | Master NAS archive | 5-bay NAS, 14 TB usable, SMB/AFP, DSM 7.x |
| Network | Hot Springs, AR | Internet connection | ~750 Mbps download, home LAN |

Critical BIOS Settings (Hot Springs Beelink)

In the BIOS power settings, set the AC power loss behavior (commonly labeled "Restore on AC Power Loss" or "State After Power Failure") to Power On, so the Beelink auto-restarts after power outages.

Windows Auto-Login

  1. Press Windows+R, type netplwiz
  2. Uncheck "Users must enter a user name and password"
  3. Apply, enter current password twice to confirm

Sleep Settings

Settings → System → Power & battery → Screen and sleep → "Never" for sleep (screen can turn off, system cannot).

NAS Auto-Restart

DSM → Control Panel → Hardware & Power → check "Restart automatically after a power failure"

Recommended: UPS

Add an APC BE600M1 (~$60) to the NAS via USB. DSM auto-shuts down cleanly before battery dies. Protects against data corruption during Hot Springs storms. Plug the Beelink into the same UPS.

3. Accounts and Credentials

All accounts required to rebuild the environment. Store these securely - never in a public repo.

| Service | Purpose | Account | Notes |
|---|---|---|---|
| Google (personal) | Gmail, Photos, Drive, YouTube, Calendar | walhus@gmail.com | 5 TB Google One plan. Has all content. |
| Google (business) | AdSense for 120 websites | wholetechtexas@gmail.com | Publisher: pub-7759195213529699 |
| DigitalOcean | Droplet hosting 120 websites | paul account | Droplet IP: 143.198.182.180 (ubuntu-s-1vcpu-1gb-nyc1-01) |
| Backblaze B2 | Cloud storage for photos + backups | walhus account | 2 buckets: walhus-photos (public), walhus-private |
| Anthropic | Claude API for photo grading | walhus account | API key in ~/.anthropic_api_key |
| GoDaddy | Domain registrar | paul account | Most of the 120 domains + convcast.com |
| Limitless | Pendant lifelog data | paul account | API key in ~/.limitless_api_key on NAS |
| GitHub | Code repositories | paulwalhus | All private repos. Needs re-auth for Claude Code scheduled agents. |
| Claude Code | AI development tool | Signed in as Paul | Runs on Beelinks, connected to Anthropic |
| Gmail MCP | Claude's Gmail access | walhus@gmail.com | Connected via claude.ai connectors |
| Google Calendar MCP | Claude's Calendar access | walhus@gmail.com | Connected via claude.ai connectors |

Standard Credentials

Private web pages: paul / testpass1944
NAS SMB: walhu / (NAS password)
Droplet SSH: root@143.198.182.180 (key-based, ~/.ssh/id_rsa)
Phone: 501.365.1001 (Hot Springs)

B2 API Credentials

Key ID: 000f357396a64340000000002
Application Key: K000TqUALZteRGZA1hEqwfLqbnla1sI
Key name: photos-upload

4. Domain Portfolio

120+ domains registered primarily through GoDaddy. All point to the DigitalOcean droplet (143.198.182.180) via A records.

Flagship Domains

| Category | Domains |
|---|---|
| Real Estate | lakehamiltonhomesforsale.com, realestatehotsprings.com, bnbhot.com, realhotsprings.com, retirehotsprings.com |
| Austin | austincribs.com, austinpads.com, austinhangout.com, austintexascoworking.com, austintechnews.live, atxtechtrends.com |
| Tech | robotnewstoday.com, aiwayback.com, far-uvclight.com, cargosolar.com |
| Family/History | barneyfrauenthal.com, barneyebsworth.com, carolekatchen.com |
| Coworking | coworkingcongress.com, texascoworking.com, coworkingretreat.com, americancoworking.com |
| Life/Other | afterhours.party, smallhomevillage.com, offgridder.com, touroftexas.com |

DNS Configuration

Every domain needs these A records in GoDaddy DNS:

| Type | Name | Value | TTL |
|---|---|---|---|
| A | @ | 143.198.182.180 | 1 hour |
| A | www | 143.198.182.180 | 1 hour |

Critical: Delete any GoDaddy parking/forwarding records. Keep only the two A records above. SSL is handled by Let's Encrypt on the droplet.
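
After updating records, propagation can be verified from the Beelink. A minimal Python sketch (the domain list here is illustrative; feed it the full portfolio):

# verify-dns.py (hypothetical helper) - check that domains resolve to the droplet
import socket

DROPLET_IP = "143.198.182.180"
DOMAINS = ["lakehamiltonhomesforsale.com", "austincribs.com"]  # extend to all 120

for d in DOMAINS:
    for host in (d, "www." + d):
        try:
            ip = socket.gethostbyname(host)
            print(host, "OK" if ip == DROPLET_IP else "WRONG -> " + ip)
        except socket.gaierror:
            print(host, "NO RECORD")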

5. Synology NAS Setup

The DS1522+ is the master archive for everything - photos, lifelogs, website backups, Google data.

Initial Setup

Step 1: Install DSM

Follow Synology's setup wizard. Create admin account. Configure network with static IP or hostname HS_DS1522plus.

Step 2: Create Shared Folder

Control Panel → Shared Folder → Create → Name: home

Enable SMB in Control Panel → File Services

Step 3: Enable Auto-Restart

Control Panel → Hardware & Power → check "Restart automatically after a power failure"

Step 4: Folder Structure

\\HS_DS1522plus\home\
├── photos\                   (final flattened photo archive)
├── google-photos\
│   ├── zips\                 (downloaded takeout zips, deleted after extraction)
│   ├── extracted\            (unpacked during processing)
│   └── old-attempt-apr11\    (historical)
├── google-takeout\
│   ├── zips\
│   └── extracted\
├── digitalocean\             (droplet backups; daily sync destination)
│   ├── www\                  (120 website files)
│   ├── nginx\
│   ├── letsencrypt\
│   ├── backups\              (8 daily tarballs)
│   └── crontab.txt
├── lifelog\                  (Limitless pendant data, 57+ weeks)
│   ├── YYYY-MM-DD\           (one folder per week, Sunday-dated)
│   │   ├── audio\
│   │   ├── by-day\
│   │   ├── transcripts.json
│   │   └── _summary.txt
│   ├── exports\              (manual data exports)
│   └── fetch_limitless.py
└── beelink-archive\          (overflow from C: drive)

Mapping from Windows

net use Z: \\HS_DS1522plus\home /persistent:yes

Or through File Explorer: Map Network Drive → Z: → \\HS_DS1522plus\home → check "Reconnect at sign-in"
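
Automation scripts should fail fast when the mapping has dropped (e.g., after an unplanned reboot). A tiny guard, assuming the photos folder exists on the share:

# Put at the top of any script that needs the NAS (hypothetical snippet)
import os, sys

if not os.path.isdir(r"Z:\photos"):
    sys.exit(r"Z: not mapped - run: net use Z: \\HS_DS1522plus\home /persistent:yes")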

7. DigitalOcean Droplet

Ubuntu droplet hosting all 120 websites, cloud photo grading, and private data viewers.

Name: ubuntu-s-1vcpu-1gb-nyc1-01
IP: 143.198.182.180
OS: Ubuntu 22.04
Size: 1 vCPU, 1 GB RAM, 24 GB SSD
Region: NYC1
Cost: ~$6/month

Software Stack

Nginx (web server), Certbot/Let's Encrypt (SSL), and Python 3 with the anthropic and b2sdk packages for cloud grading. Install commands are in Section 15, Phase 3.

Nginx Configuration

Each website has a config in /etc/nginx/sites-available/DOMAIN symlinked to /etc/nginx/sites-enabled/.

Standard template:

# HTTPS server block; certbot --nginx adds the matching port-80 redirect
server {
    server_name DOMAIN www.DOMAIN;
    root /var/www/DOMAIN;
    index index.html;
    location / { try_files $uri $uri/ =404; }
    listen 443 ssl;
    ssl_certificate /etc/letsencrypt/live/DOMAIN/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/DOMAIN/privkey.pem;
    include /etc/letsencrypt/options-ssl-nginx.conf;
    ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem;
}
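
Bringing a new domain online means stamping this template out, enabling it, and reloading nginx. A hypothetical Python helper for those steps (assumes the Let's Encrypt cert already exists; otherwise run certbot first):

# new-site.py DOMAIN (hypothetical helper; run as root on the droplet)
import subprocess, sys
from pathlib import Path

TEMPLATE = """server {{
    server_name {d} www.{d};
    root /var/www/{d};
    index index.html;
    location / {{ try_files $uri $uri/ =404; }}
    listen 443 ssl;
    ssl_certificate /etc/letsencrypt/live/{d}/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/{d}/privkey.pem;
    include /etc/letsencrypt/options-ssl-nginx.conf;
    ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem;
}}
"""

domain = sys.argv[1]
avail = Path("/etc/nginx/sites-available") / domain
avail.write_text(TEMPLATE.format(d=domain))
enabled = Path("/etc/nginx/sites-enabled") / domain
if not enabled.exists():
    enabled.symlink_to(avail)                          # enable the site
subprocess.run(["nginx", "-t"], check=True)            # validate config first
subprocess.run(["systemctl", "reload", "nginx"], check=True)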

Private Pages (password-protected)

# Create htpasswd (htpasswd comes from the apache2-utils package)
htpasswd -bc /etc/nginx/.htpasswd-private paul testpass1944

# Add to nginx config
location /private/ {
    auth_basic "Private Area";
    auth_basic_user_file /etc/nginx/.htpasswd-private;
    try_files $uri $uri/ $uri/index.html =404;
}

SSL Setup (per domain)

certbot --nginx -d DOMAIN -d www.DOMAIN --non-interactive --agree-tos --email walhus@gmail.com

Nightly Backups (on droplet)

Crontab: 0 3 * * * cd /root/backups && tar -czf www-$(date +\%Y\%m\%d).tar.gz /var/www/ && ls -t www-*.tar.gz | tail -n +9 | xargs rm -f

Creates /root/backups/www-YYYYMMDD.tar.gz nightly, keeps last 8 days.
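
To roll back from a tarball, extract it at / (tar stores the paths relative to the root, so /var/www/ is recreated in place). A sketch with Python's stdlib tarfile and an illustrative date; run as root:

# restore one nightly backup (date is illustrative)
import tarfile

with tarfile.open("/root/backups/www-20250101.tar.gz") as t:
    t.extractall("/")   # archived paths are var/www/..., so this rebuilds /var/www/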

8. Backblaze B2 Storage

Account Setup

  1. Sign up at backblaze.com, choose B2 Cloud Storage
  2. Add credit card (first 10 GB free)
  3. Create Application Key: App Keys → Add New Application Key → all buckets, read/write
  4. Save keyID and applicationKey - shown ONCE

Buckets

| Bucket | Visibility | Contents | Monthly Cost |
|---|---|---|---|
| walhus-photos | allPublic | Safe-rated photos for web serving | ~$8-15 |
| walhus-private | allPrivate | Encrypted backups: lifelog, scripts, grades | ~$2 |

Creating Buckets via SDK

from b2sdk.v2 import InMemoryAccountInfo, B2Api

info = InMemoryAccountInfo()     # keeps auth tokens in memory only
api = B2Api(info)
api.authorize_account("production", "KEY_ID", "APP_KEY")
bucket = api.create_bucket("walhus-photos", "allPublic")   # repeat with "allPrivate" for walhus-private

Public URL Pattern

https://f000.backblazeb2.com/file/walhus-photos/{file_path}
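
A hedged usage sketch tying the two together: upload one photo, then print the URL it will be served from (KEY_ID/APP_KEY and both file paths are placeholders):

from b2sdk.v2 import InMemoryAccountInfo, B2Api

api = B2Api(InMemoryAccountInfo())
api.authorize_account("production", "KEY_ID", "APP_KEY")
bucket = api.get_bucket_by_name("walhus-photos")

bucket.upload_local_file(local_file=r"Z:\photos\example.jpg",
                         file_name="photos/example.jpg")
print("https://f000.backblazeb2.com/file/walhus-photos/photos/example.jpg")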

9. Claude Code Installation

Claude Code is the AI orchestration layer. Runs on both Beelinks. Connects to Anthropic's Claude API and MCP connectors (Gmail, Calendar).

Installation

  1. Download from claude.com/code or via npm install -g @anthropic-ai/claude-code
  2. Sign in with Paul's Anthropic account
  3. Claude Code uses the logged-in subscription for chat; separate API key for programmatic grading

Memory System

Persistent notes live in C:\Users\walhu\.claude\projects\C--Users-walhu-websites\memory\.

MCP Connectors

Connected at claude.ai/settings/connectors: Gmail and Google Calendar (see Section 3).

Scheduled Agents (Remote)

Managed at claude.ai/code/scheduled. Run on Anthropic's cloud, independent of local hardware.

10. 120 Websites

All 120 sites live at /var/www/DOMAIN/ on the droplet. Standard template includes AdSense, Analytics, subnav bar, WholeTech branding.

Standard Site Features

Every site's index.html includes:

<meta name="google-adsense-account" content="ca-pub-7759195213529699">
<script async src="https://www.googletagmanager.com/gtag/js?id=G-MFQ0P2H8G8"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'G-MFQ0P2H8G8');
</script>
<script async src="https://pagead2.googlesyndication.com/pagead/js/adsbygoogle.js?client=ca-pub-7759195213529699"></script>

And the WholeTech lightning bolt "Return to network" badge:

<a href="https://wholetech.com" style="...">⚡ WholeTech</a>

Subnav Bar

Standard links: About, Resources, FAQ, News, Videos, Social, Sitemap

11. Photo Pipeline

The automated system that extracts photo zips, uploads to B2, grades with AI, and deploys galleries.

Scripts

| Script | Location | Purpose |
|---|---|---|
| extract-photos.sh | C:\Users\walhu\websites\ | Unzips completed takeout zips to extracted folder |
| upload-to-b2.py | C:\Users\walhu\websites\ | Uploads extracted photos to walhus-photos bucket |
| grade-photos.py | C:\Users\walhu\websites\ | AI-grades photos via Claude Haiku: score, alt, description, tags, sensitivity |
| build-photo-manifest.sh | C:\Users\walhu\websites\ | Scans extracted photos, builds manifest JSON |
| auto-pipeline.sh | C:\Users\walhu\websites\ | Watches for new zips, auto-extracts, uploads, rebuilds gallery |
| cloud-grade-photos.py | /root/ (on droplet) | Grades photos from B2 URLs; runs in the cloud, weather-proof |
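
auto-pipeline.sh itself is bash; its detection logic is simple enough to sketch in Python. A zip counts as complete once Chrome's .crdownload placeholder is gone:

# Sketch of the watch loop (the real implementation is auto-pipeline.sh)
import time
from pathlib import Path

ZIPS = Path(r"Z:\google-photos\zips")
seen = set()

while True:
    for z in ZIPS.glob("*.zip"):
        if z in seen or Path(str(z) + ".crdownload").exists():
            continue                      # already handled, or Chrome still writing
        seen.add(z)
        print("ready:", z.name)           # hand off: unzip, upload, rebuild manifest
    time.sleep(60)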

Pipeline Flow

takeout.google.com
    ↓ Chrome downloads to Z:\google-photos\zips\
zip file complete (.zip, not .crdownload)
    ↓ auto-pipeline.sh detects it
unzip to Z:\google-photos\extracted\Takeout\Google Photos\
    ↓
upload-to-b2.py pushes new files to walhus-photos bucket
    ↓
grade-photos.py (Beelink) OR cloud-grade-photos.py (DO) grades each photo
    ↓
JSON manifest rebuilt with all grades
    ↓
Gallery HTML injected with manifest, deployed to austinspring.com/photos
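
The grading call at the center of grade-photos.py / cloud-grade-photos.py looks roughly like this sketch; the model name, prompt wording, and output handling are illustrative, not the scripts' exact contents:

# One photo, one grading call (sketch; reads the key from ~/.anthropic_api_key)
from pathlib import Path
import anthropic

client = anthropic.Anthropic(
    api_key=Path.home().joinpath(".anthropic_api_key").read_text().strip())

msg = client.messages.create(
    model="claude-3-5-haiku-latest",      # assumed model id
    max_tokens=300,
    messages=[{
        "role": "user",
        "content": [
            {"type": "image",
             "source": {"type": "url",
                        "url": "https://f000.backblazeb2.com/file/walhus-photos/example.jpg"}},
            {"type": "text",
             "text": "Grade this photo. Return JSON: score, alt, description, tags, sensitivity."},
        ],
    }])
print(msg.content[0].text)                # parsed into the manifest JSON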

12. Google Takeout Workflow

How to extract all Google account data - the one-time export that seeds the entire archive.

Step 1: Request Takeout

Go to takeout.google.com. Deselect all. Select desired products.

Critical: Choose file size 50 GB (not default 2 GB) so you get manageable chunks.

Step 2: Wait

Google takes hours to days to prepare the export. The scheduled agent (Section 13) monitors Gmail for the completion email.

Step 3: Download

Set Chrome's download location to the NAS folder before clicking the download buttons. Download links expire after 7 days.

Step 4: Extract

Photo zips are handled automatically by the pipeline; for other data, run extract-photos.sh or unzip manually.

Parsers

| Product | Script | Output |
|---|---|---|
| YouTube | parse-youtube.py | Database + embed codes matched to websites |
| Gmail (.mbox) | parse-gmail.py | Searchable private email archive |
| Voice (MP3 + HTML) | parse-voice.py | Audio player page with voicemails |
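
For orientation, Gmail's .mbox is walkable with Python's stdlib mailbox module; the path below assumes Takeout's default export name, and parse-gmail.py's real schema is not shown here:

import mailbox

MBOX = r"Z:\google-takeout\extracted\Takeout\Mail\All mail Including Spam and Trash.mbox"
for msg in mailbox.mbox(MBOX):
    print(msg["Date"], msg["From"], msg["Subject"])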

13. Automated Systems

Windows Task Scheduler (Beelink)

| Task | Schedule | What it runs |
|---|---|---|
| Sync Droplet to NAS | Daily 4:00 AM | bash /c/Users/walhu/websites/sync-droplet-to-nas.sh |

Create with PowerShell:

# Registers the daily 4 AM sync task via Git Bash
$action = New-ScheduledTaskAction -Execute 'C:\Program Files\Git\bin\bash.exe' -Argument '-l -c /c/Users/walhu/websites/sync-droplet-to-nas.sh'
$trigger = New-ScheduledTaskTrigger -Daily -At 4:00AM
$settings = New-ScheduledTaskSettingsSet -StartWhenAvailable -DontStopIfGoingOnBatteries -AllowStartIfOnBatteries
Register-ScheduledTask -TaskName 'Sync Droplet to NAS' -Action $action -Trigger $trigger -Settings $settings -User $env:USERNAME -Force

Droplet Crontab

0 3 * * * cd /root/backups && tar -czf www-$(date +\%Y\%m\%d).tar.gz /var/www/ && ls -t www-*.tar.gz | tail -n +9 | xargs rm -f
0 2 * * * certbot renew --quiet

Limitless Lifelog Fetch

Script on NAS: Z:\lifelog\fetch_limitless.py

API key: ~/.limitless_api_key

Usage: python3 fetch_limitless.py (current week) or with date argument
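
For reference, a heavily hedged sketch of what a fetch might look like; the endpoint, auth header, and parameters here are assumptions, and fetch_limitless.py on the NAS is the authoritative version:

# Assumed API shape - verify against fetch_limitless.py before relying on this
from pathlib import Path
import requests

key = Path.home().joinpath(".limitless_api_key").read_text().strip()
resp = requests.get("https://api.limitless.ai/v1/lifelogs",   # assumed endpoint
                    headers={"X-API-Key": key},               # assumed auth header
                    params={"date": "2025-01-05"})            # Sunday of target week
resp.raise_for_status()
print(resp.json())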

Claude Code Scheduled Agent

Trigger: trig_01Ap71eVHgoVVMv7VxvUpPbc

Runs every 4 hours in Anthropic cloud. Checks Gmail for Takeout emails, creates Calendar alert + draft when found.

Manage: claude.ai/code/scheduled

14. All Scripts Reference

Every automation script, where it lives, and what it does.

| Script | Location | Runs where | Purpose |
|---|---|---|---|
| sync-droplet-to-nas.sh | C:\Users\walhu\websites\ | Beelink (Task Scheduler) | Daily backup of droplet to NAS |
| extract-photos.sh | C:\Users\walhu\websites\ | Beelink | Unzips photo takeouts |
| upload-to-b2.py | C:\Users\walhu\websites\ | Beelink | Uploads photos to B2 |
| grade-photos.py | C:\Users\walhu\websites\ | Beelink | AI-grade local photos via Claude |
| cloud-grade-photos.py | /root/ (droplet) | Droplet | AI-grade photos from B2 URLs |
| auto-pipeline.sh | C:\Users\walhu\websites\ | Beelink | Watches for new zips, auto-processes |
| build-photo-manifest.sh | C:\Users\walhu\websites\ | Beelink | Builds gallery JSON from extracted photos |
| backup-to-b2-private.py | C:\Users\walhu\websites\ | Beelink | Backs up lifelog, scripts, configs to private B2 |
| deploy-gallery.sh | C:\Users\walhu\websites\ | Beelink | Deploys curated photo gallery to austinspring.com |
| parse-youtube.py | C:\Users\walhu\websites\ | Beelink / Droplet | Parse YouTube Takeout, generate embed codes |
| parse-gmail.py | C:\Users\walhu\websites\ | Beelink / Droplet | Parse Gmail .mbox into searchable database |
| parse-voice.py | C:\Users\walhu\websites\ | Beelink / Droplet | Parse Google Voice Takeout |
| fetch_limitless.py | Z:\lifelog\ (NAS) | NAS / Beelink | Weekly Limitless pendant data fetch |

15. Full Rebuild Playbook

Step-by-step to rebuild the entire environment from nothing. Assumes Paul has his existing Google, DigitalOcean, GoDaddy, and Backblaze accounts.

Phase 1: Foundation (Day 1)

Step 1: Get hardware

Beelink + Synology DS1522+ (or any NAS with SMB). Connect to home network.

Step 2: Install Windows 11 + Git Bash + Python + Chrome

Step 3: Set up DSM, create home share, enable SMB, set auto-restart

Step 4: Map NAS as Z: on Beelink

net use Z: \\HS_DS1522plus\home /persistent:yes

Step 5: Create DigitalOcean droplet

Ubuntu 22.04, NYC1, ~$6/mo tier. Add SSH key from Beelink.

Step 6: Install Claude Code on Beelink

Sign in, restore memory system from backup.

Phase 2: Cloud Accounts

Step 7: Backblaze B2

Create account, create walhus-photos (public) and walhus-private (private) buckets. Generate application key.

Step 8: Anthropic API

console.anthropic.com → API Keys → Create. Store at ~/.anthropic_api_key on both Beelink and droplet.

Step 9: Install Python packages

Beelink: pip install anthropic b2sdk

Droplet: pip3 install --break-system-packages anthropic b2sdk

Phase 3: Restore Websites

Step 10: Install nginx + certbot on droplet

apt update && apt install -y nginx certbot python3-certbot-nginx

Step 11: Restore site files from latest backup

If NAS survived: scp -r /z/digitalocean/www/www/* root@143.198.182.180:/var/www/

If B2: download latest backup tarball from walhus-private bucket.
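
A sketch of the B2 path, assuming the tarballs sit in the bucket under their original www-*.tar.gz names (KEY_ID/APP_KEY are placeholders):

from b2sdk.v2 import InMemoryAccountInfo, B2Api

api = B2Api(InMemoryAccountInfo())
api.authorize_account("production", "KEY_ID", "APP_KEY")
bucket = api.get_bucket_by_name("walhus-private")

# newest nightly tarball by name (www-YYYYMMDD sorts chronologically)
names = [fv.file_name for fv, _ in bucket.ls(recursive=True)]
latest = sorted(n for n in names if n.startswith("www-") and n.endswith(".tar.gz"))[-1]
bucket.download_file_by_name(latest).save_to(latest)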

Step 12: Restore nginx configs

scp -r /z/digitalocean/nginx/* root@143.198.182.180:/etc/nginx/

Step 13: Restore SSL certs

scp -r /z/digitalocean/letsencrypt/* root@143.198.182.180:/etc/letsencrypt/
# or re-issue:
certbot --nginx -d DOMAIN -d www.DOMAIN

Step 14: Point DNS

For each domain: GoDaddy → DNS → A records pointing to 143.198.182.180
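
With 120 domains, clicking through GoDaddy's UI is slow; their DNS API can script the updates. A hedged sketch (needs a production key/secret from developer.godaddy.com; the domain list is illustrative):

import requests

SSO = "sso-key GODADDY_KEY:GODADDY_SECRET"   # placeholder credentials
RECORDS = [{"data": "143.198.182.180", "ttl": 3600}]

for domain in ["lakehamiltonhomesforsale.com", "austincribs.com"]:  # full list here
    for name in ("@", "www"):
        r = requests.put(
            f"https://api.godaddy.com/v1/domains/{domain}/records/A/{name}",
            headers={"Authorization": SSO},
            json=RECORDS)
        r.raise_for_status()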

Phase 4: Restore Data

Step 15: Request Google Takeout

takeout.google.com, 50 GB chunks, all products. Wait for email.

Step 16: Download to NAS

Set Chrome download folder to Z:\google-photos\zips\ or Z:\google-takeout\zips\.

Step 17: Run pipeline

cd /c/Users/walhu/websites
bash auto-pipeline.sh  # watches and processes zips as they land

Phase 5: Automation

Step 18: Windows Task Scheduler

Register daily droplet sync (see code in Section 13)

Step 19: Droplet crontab

Nightly backups, cert renewal

Step 20: Claude Code scheduled agent

Recreate Takeout monitor via RemoteTrigger API

16. Disaster Recovery

If the Beelink dies

Nothing irreplaceable lives on the Beelink. Replace the hardware, repeat Phase 1 of the rebuild playbook (Windows, Git Bash, Python, Chrome, Claude Code, Z: mapping), then re-register the scheduled tasks from Phase 5. The Cedar Creek Beelink can cover in the meantime.

If the NAS dies

Restore scripts, lifelog data, and photo grades from the walhus-private B2 bucket (backup-to-b2-private.py is the upload side). Re-seed the photo originals with a fresh Google Takeout (Section 12). The droplet and all 120 websites are unaffected.

If the droplet dies

Create a new droplet and run Phase 3 of the rebuild playbook: install nginx + certbot, then restore /var/www, the nginx configs, and the certs from the NAS copy (Z:\digitalocean\) or the latest B2 tarball. If the new droplet gets a different IP, update every domain's A records (Section 4).

If everything dies at once

Work through the full rebuild playbook (Section 15) from Phase 1. The hard prerequisites are the accounts in Section 3 and the critical files listed below.

Recovery Time Objective

~4 hours to get 120 websites back online after droplet loss.
~1-2 days to rebuild full photo archive from new Google Takeout.
~1 week for complete environment restoration with all automations.

Critical Files to Never Lose

  • ~/.anthropic_api_key - Claude API access
  • B2 Key ID and Application Key - only shown once
  • SSH private key (~/.ssh/id_rsa) - droplet access
  • Claude Code memory files in ~/.claude/projects/
  • photo-grades.json - 15,000+ hours of AI grading work