
WholeTech Environment

Complete Disaster Recovery Playbook · Version 1.0 · April 13, 2026

⚠ CONFIDENTIAL: This document contains credentials, API keys, SSH details, and infrastructure information. Store securely. Do not commit to public repositories. Do not share without redacting sensitive fields.

Table of Contents

  1. Environment Overview
  2. Hardware
  3. Accounts and Credentials
  4. Domain Portfolio
  5. Synology NAS Setup
  6. Beelink Setup
  7. DigitalOcean Droplet
  8. Backblaze B2 Storage
  9. Claude Code Installation
  10. 120 Websites
  11. Photo Pipeline
  12. Google Takeout Workflow
  13. Automated Systems
  14. All Scripts Reference
  15. Full Rebuild Playbook
  16. Disaster Recovery

1. Environment Overview

The WholeTech Network: 120 websites on a DigitalOcean droplet, a 14 TB Synology NAS holding 250,000+ original photos, two Beelink mini-PCs running automations, Backblaze B2 for cloud storage, and Claude Code orchestrating everything.

Built by: Paul Walhus (@springnet)
Started: 1996 (WholeTech), 2026 (current architecture)
Primary locations: Hot Springs, AR + Cedar Creek, TX
Cloud locations: DigitalOcean NYC, Backblaze
Monthly cost: ~$20-25/month

3-Tier Architecture

GOOGLE / EXTERNAL SOURCES
    -> (Takeout, Limitless API, scraping)

HOT SPRINGS BEELINK (Windows 11)
  - Chrome for browser-auth downloads
  - Git Bash + Python 3 for scripts
  - Claude Code for orchestration
  - Persistent mapping to NAS as Z:\
    ->

SYNOLOGY DS1522+ NAS (14 TB)
  - Master archive for everything
  - SMB share as \\HS_DS1522plus\home
  - Mapped to Windows as Z:
    ->

BACKBLAZE B2 (cloud)
  - walhus-photos (public, web serving)
  - walhus-private (encrypted backup)
    ->

DIGITALOCEAN DROPLET (NYC1)
  - 120 websites live
  - Cloud photo grading (Claude API)
  - Password-protected private data viewers
  - Nightly backups at 3 AM

2. Hardware

Device                 Location         Role                       Specs
Beelink (hsspabee)     Hot Springs, AR  Primary automation server  Windows 11 Pro, Intel x86_64, 16+ GB RAM
Beelink (Cedar Creek)  Cedar Creek, TX  Backup/secondary           Similar Beelink, Windows
Synology DS1522+       Hot Springs, AR  Master NAS archive         5-bay, 14 TB usable, DSM 7.x
Internet               Hot Springs, AR  Home connection            ~750 Mbps download

Critical Always-On Settings (Hot Springs Beelink)

Windows Auto-Login

  1. Press Windows+R, type netplwiz
  2. Uncheck "Users must enter a user name and password"
  3. Apply, enter current password twice

Sleep Settings

Settings -> System -> Power & battery -> Sleep: Never

NAS Auto-Restart

DSM -> Control Panel -> Hardware & Power -> check "Restart automatically after a power failure"

Recommended: APC BE600M1 UPS (~$60) connected to NAS via USB. DSM auto-shuts down cleanly during outages.

3. Accounts and Credentials

Service            Account                   Notes
Google (personal)  walhus@gmail.com          5 TB Google One plan. All content source.
Google (business)  wholetechtexas@gmail.com  AdSense publisher: pub-7759195213529699
DigitalOcean       paul account              Droplet IP: 143.198.182.180
Backblaze B2       walhus account            2 buckets: walhus-photos, walhus-private
Anthropic          walhus account            API key in ~/.anthropic_api_key
GoDaddy            paul account              Most domains including convcast.com
Limitless          paul account              API key: ~/.limitless_api_key
GitHub             paulwalhus                All private repos
Claude Code        Paul's account            Runs on Beelinks

Standard Credentials

Private web pages: paul / testpass1944
NAS SMB: walhu / (NAS password)
Droplet SSH: root@143.198.182.180 (key-based)
Phone: 501.365.1001 (Hot Springs)

B2 API Credentials

Key ID: 000f357396a64340000000002
Application Key: K000TqUALZteRGZA1hEqwfLqbnla1sI

4. Domain Portfolio

Flagship Domains

DNS Configuration

Every domain needs these A records:

Type  Name  Value            TTL
A     @     143.198.182.180  1 hour
A     www   143.198.182.180  1 hour

Critical: Delete any GoDaddy parking/forwarding records. Keep only the two A records. SSL handled by Let's Encrypt.
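With 120 domains to repoint, it is easy to miss a stale parking record. A verification pass can be sketched in Python (hypothetical helper, not one of the named scripts; it uses the system resolver rather than an authoritative DNS query, so results depend on propagation):

```python
import socket

DROPLET_IP = "143.198.182.180"

def check_domains(domains, expected=DROPLET_IP, resolve=socket.gethostbyname):
    """Return (domain, resolved_ip) pairs that do NOT point at the droplet.
    A domain that fails to resolve at all is reported with ip=None."""
    bad = []
    for domain in domains:
        try:
            ip = resolve(domain)
        except OSError:
            ip = None  # NXDOMAIN, timeout, etc.
        if ip != expected:
            bad.append((domain, ip))
    return bad
```

Run it over both the bare and www names (e.g. `check_domains(["convcast.com", "www.convcast.com"])`); an empty list means every record checked points at the droplet.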

5. Synology NAS Setup

1. Install DSM

Synology's wizard. Create admin. Hostname: HS_DS1522plus

2. Create Shared Folder

Control Panel -> Shared Folder -> Create -> Name: home. Enable SMB.

3. Enable Auto-Restart

Control Panel -> Hardware & Power -> check "Restart automatically after a power failure"

Folder Structure

\\HS_DS1522plus\home\
  photos\                    (final flattened archive)
  google-photos\
    zips\                    (takeout zips, deleted after extraction)
    extracted\               (unpacked during processing)
  google-takeout\
    zips\, extracted\
  digitalocean\              (droplet backups)
    www\, nginx\, letsencrypt\, backups\
  lifelog\                   (Limitless pendant data)
    YYYY-MM-DD\              (weekly folders)
      audio\, by-day\, transcripts.json, _summary.txt
    fetch_limitless.py
  beelink-archive\

Mapping from Windows

net use Z: \\HS_DS1522plus\home /persistent:yes

6. Beelink Setup

1. Windows 11 Pro

Fresh install. Timezone, user walhu.

2. Git for Windows

From git-scm.com. Provides Git Bash at C:\Program Files\Git\bin\bash.exe

3. Python 3

Microsoft Store (python3)

4. Chrome

Default browser for Takeout downloads

5. Claude Code

From claude.com/code. Sign in with Paul's account.

6. SSH Keys

ssh-keygen -t ed25519
ssh-copy-id root@143.198.182.180

7. Map NAS as Z:

net use Z: \\HS_DS1522plus\home /persistent:yes

8. Python Packages

pip install anthropic b2sdk

9. API Keys

echo "sk-ant-api03-..." > ~/.anthropic_api_key
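The local scripts are assumed to read these key files at startup. A minimal reader (hypothetical helper, not one of the named scripts) that strips the trailing newline `echo` writes:

```python
from pathlib import Path

def load_key(name, base=None):
    """Read an API key file such as ~/.anthropic_api_key.
    `base` defaults to the user's home directory; strip() removes the
    trailing newline that `echo` appends when the file is created."""
    base = Path(base) if base is not None else Path.home()
    return (base / name).read_text(encoding="utf-8").strip()
```

Example: `anthropic_key = load_key(".anthropic_api_key")`.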

7. DigitalOcean Droplet

Name: ubuntu-s-1vcpu-1gb-nyc1-01
IP: 143.198.182.180
OS: Ubuntu 22.04
Size: 1 vCPU, 1 GB RAM, 24 GB SSD
Cost: ~$6/month

Software Stack

Standard nginx Site Template

server {
    server_name DOMAIN www.DOMAIN;
    root /var/www/DOMAIN;
    index index.html;
    location / { try_files $uri $uri/ =404; }
    listen 443 ssl;
    ssl_certificate /etc/letsencrypt/live/DOMAIN/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/DOMAIN/privkey.pem;
    include /etc/letsencrypt/options-ssl-nginx.conf;
    ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem;
}
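With 120 sites sharing one template, per-domain configs can be stamped out mechanically. A sketch (hypothetical helper; the document does not name the actual deployment script, which would also write the file under /etc/nginx/sites-available/ and symlink it into sites-enabled/):

```python
# The standard site template from this section, with DOMAIN as placeholder.
TEMPLATE = """server {
    server_name DOMAIN www.DOMAIN;
    root /var/www/DOMAIN;
    index index.html;
    location / { try_files $uri $uri/ =404; }
    listen 443 ssl;
    ssl_certificate /etc/letsencrypt/live/DOMAIN/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/DOMAIN/privkey.pem;
    include /etc/letsencrypt/options-ssl-nginx.conf;
    ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem;
}
"""

def render_site(domain):
    """Fill the standard nginx template for one domain."""
    return TEMPLATE.replace("DOMAIN", domain)
```

After writing each rendered config, `nginx -t && systemctl reload nginx` validates and applies the whole set at once.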

Private Pages (Password Protected)

htpasswd -bc /etc/nginx/.htpasswd-private paul testpass1944

location /private/ {
    auth_basic "Private Area";
    auth_basic_user_file /etc/nginx/.htpasswd-private;
    try_files $uri $uri/ $uri/index.html =404;
}

SSL Setup

certbot --nginx -d DOMAIN -d www.DOMAIN --non-interactive --agree-tos --email walhus@gmail.com

Nightly Backups (droplet crontab)

0 3 * * * cd /root/backups && tar -czf www-$(date +\%Y\%m\%d).tar.gz /var/www/ && ls -t www-*.tar.gz | tail -n +9 | xargs rm -f
0 2 * * * certbot renew --quiet
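The `ls -t www-*.tar.gz | tail -n +9 | xargs rm -f` tail of the 3 AM job keeps only the eight newest archives. The same retention logic, sketched in Python for clarity (illustrative, not a script in the environment):

```python
from pathlib import Path

def prune_backups(backup_dir, keep=8, pattern="www-*.tar.gz"):
    """Delete all but the `keep` newest archives, mirroring the cron
    rotation. Returns the names kept, newest first."""
    archives = sorted(Path(backup_dir).glob(pattern),
                      key=lambda p: p.stat().st_mtime, reverse=True)
    for old in archives[keep:]:
        old.unlink()
    return [p.name for p in archives[:keep]]
```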

8. Backblaze B2 Storage

Setup

  1. Sign up at backblaze.com, choose B2 Cloud Storage
  2. Add credit card (first 10 GB free)
  3. Create Application Key: all buckets, read/write
  4. Save keyID + applicationKey (shown once only!)

Buckets

Bucket          Visibility  Contents                           Cost
walhus-photos   allPublic   Safe-rated photos for web          ~$8-15/mo
walhus-private  allPrivate  Lifelog, scripts, grades, backups  ~$2/mo

Public URL Pattern

https://f000.backblazeb2.com/file/walhus-photos/{path}
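Takeout filenames often contain spaces, which must be percent-encoded when building the public URL. A small builder for this pattern (hypothetical helper):

```python
from urllib.parse import quote

B2_BASE = "https://f000.backblazeb2.com/file/walhus-photos"

def public_url(path):
    """Build the public B2 URL for a file in walhus-photos.
    quote() percent-encodes spaces and other unsafe characters but
    leaves '/' alone, so folder paths stay intact."""
    return f"{B2_BASE}/{quote(path)}"

# public_url("2024/IMG 0001.jpg")
# -> https://f000.backblazeb2.com/file/walhus-photos/2024/IMG%200001.jpg
```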

Create Bucket via SDK

from b2sdk.v2 import InMemoryAccountInfo, B2Api
info = InMemoryAccountInfo()
api = B2Api(info)
api.authorize_account("production", KEY_ID, APP_KEY)
bucket = api.create_bucket("walhus-photos", "allPublic")

9. Claude Code Installation

Install

  1. Download from claude.com/code
  2. Sign in with Paul's Anthropic account

Memory System

Persistent notes in C:\Users\walhu\.claude\projects\C--Users-walhu-websites\memory\

MCP Connectors

Connected at claude.ai/settings/connectors:

Scheduled Agents

Managed at claude.ai/code/scheduled. Run in Anthropic cloud, independent of local hardware.

10. 120 Websites

Standard Template (in every index.html)

<meta name="google-adsense-account" content="ca-pub-7759195213529699">
<script async src="https://www.googletagmanager.com/gtag/js?id=G-MFQ0P2H8G8"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'G-MFQ0P2H8G8');
</script>
<script async src="https://pagead2.googlesyndication.com/pagead/js/adsbygoogle.js?client=ca-pub-7759195213529699" crossorigin="anonymous"></script>

WholeTech Back Badge

<a href="https://wholetech.com" style="...">⚡ WholeTech</a>

Standard Subnav

About, Resources, FAQ, News, Videos, Social, Sitemap

11. Photo Pipeline

Scripts

Script                   Location          Purpose
extract-photos.sh        websites\         Unzips takeout zips
upload-to-b2.py          websites\         Upload photos to B2
grade-photos.py          websites\         AI-grade via Claude Haiku
build-photo-manifest.sh  websites\         Build gallery JSON
auto-pipeline.sh         websites\         Watches zips, auto-processes
cloud-grade-photos.py    /root/ (droplet)  Cloud grading from B2 URLs

Pipeline Flow

takeout.google.com
  -> Chrome downloads to Z:\google-photos\zips\
  -> auto-pipeline.sh detects completed .zip
  -> unzip to Z:\google-photos\extracted\
  -> upload-to-b2.py pushes to walhus-photos bucket
  -> grade-photos.py grades each photo with Claude
  -> JSON manifest rebuilt with all grades
  -> Gallery HTML injected, deployed to austinspring.com/photos
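The "detects completed .zip" step has to distinguish a finished download from one Chrome is still writing. One way to do that, sketched in Python as an illustrative stand-in for the check inside auto-pipeline.sh (a shell script): Chrome's in-progress files carry a .crdownload extension and so never match *.zip, and a truncated zip fails the central-directory check.

```python
import zipfile
from pathlib import Path

def completed_zips(zip_dir):
    """Yield zips in zip_dir that are fully downloaded. A partially
    written zip has no end-of-central-directory record, so
    zipfile.is_zipfile() returns False for it."""
    for p in Path(zip_dir).glob("*.zip"):
        if zipfile.is_zipfile(p):
            yield p
```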

12. Google Takeout Workflow

1. Request Takeout

takeout.google.com. Deselect all. Select products. Choose 50 GB chunks (not 2 GB default).

2. Wait

Hours to days. Scheduled agent monitors Gmail for completion email.

3. Download

Set Chrome download location to NAS folder. 7-day link expiration.

4. Extract

Pipeline handles automatically.

Parsers

Product  Script            Output
YouTube  parse-youtube.py  Database + embeds matched to sites
Gmail    parse-gmail.py    Searchable email archive
Voice    parse-voice.py    Audio player with voicemails
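Gmail arrives from Takeout as a single .mbox file, which Python's standard-library mailbox module walks message by message. A minimal index sketch (illustrative only, not the actual parse-gmail.py):

```python
import mailbox

def index_mbox(path):
    """Walk a Takeout .mbox and return (date, sender, subject) tuples,
    the minimum needed to build a searchable archive index."""
    rows = []
    for msg in mailbox.mbox(path):
        rows.append((msg.get("Date", ""),
                     msg.get("From", ""),
                     msg.get("Subject", "")))
    return rows
```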

13. Automated Systems

Windows Task Scheduler

Task                 Schedule       Script
Sync Droplet to NAS  Daily 4:00 AM  sync-droplet-to-nas.sh

Create task:

$action = New-ScheduledTaskAction -Execute `
  'C:\Program Files\Git\bin\bash.exe' -Argument `
  '-l -c /c/Users/walhu/websites/sync-droplet-to-nas.sh'
$trigger = New-ScheduledTaskTrigger -Daily -At 4:00AM
$settings = New-ScheduledTaskSettingsSet -StartWhenAvailable
Register-ScheduledTask -TaskName 'Sync Droplet to NAS' `
  -Action $action -Trigger $trigger -Settings $settings `
  -User $env:USERNAME -Force

Droplet Crontab

0 3 * * * cd /root/backups && tar -czf www-$(date +\%Y\%m\%d).tar.gz /var/www/ && ls -t www-*.tar.gz | tail -n +9 | xargs rm -f
0 2 * * * certbot renew --quiet

Limitless Lifelog Fetch

Script on NAS: Z:\lifelog\fetch_limitless.py

API key: ~/.limitless_api_key

Usage: python3 fetch_limitless.py [YYYY-MM-DD]

Claude Code Scheduled Agent

Trigger ID: trig_01Ap71eVHgoVVMv7VxvUpPbc

Every 4 hours. Checks Gmail for Takeout emails. Creates Calendar alert + Gmail draft on detection.

14. All Scripts Reference

Script                   Runs Where       Purpose
sync-droplet-to-nas.sh   Beelink          Daily droplet backup to NAS
extract-photos.sh        Beelink          Unzip photo takeouts
upload-to-b2.py          Beelink          Push to walhus-photos bucket
grade-photos.py          Beelink          Local AI grading via Claude
cloud-grade-photos.py    Droplet          Cloud grading from B2 URLs
auto-pipeline.sh         Beelink          Watches zips, auto-processes
build-photo-manifest.sh  Beelink          Gallery JSON builder
backup-to-b2-private.py  Beelink          Back up lifelog/scripts to private B2
deploy-gallery.sh        Beelink          Deploy gallery to austinspring.com
parse-youtube.py         Beelink/Droplet  Parse YouTube Takeout
parse-gmail.py           Beelink/Droplet  Parse Gmail .mbox
parse-voice.py           Beelink/Droplet  Parse Google Voice
fetch_limitless.py       NAS/Beelink      Weekly Limitless fetch

15. Full Rebuild Playbook

Phase 1: Foundation (Day 1)

1. Hardware

Beelink + Synology NAS, home network

2. OS install

Windows 11 + Git Bash + Python + Chrome on Beelink

3. DSM setup

Home share, SMB enabled, auto-restart configured

4. Map NAS

net use Z: \\HS_DS1522plus\home /persistent:yes

5. Create droplet

Ubuntu 22.04, NYC1, ~$6/mo. Add SSH key.

6. Install Claude Code

Sign in, restore memory from backup

Phase 2: Cloud Accounts

7. Backblaze

Create account, both buckets, application key

8. Anthropic API

API key to ~/.anthropic_api_key on both systems

9. Python packages

pip install anthropic b2sdk  # Beelink
pip3 install --break-system-packages anthropic b2sdk  # Droplet

Phase 3: Restore Websites

10. Install nginx + certbot

apt install -y nginx certbot python3-certbot-nginx

11. Restore site files

scp -r /z/digitalocean/www/www/* root@DROPLET:/var/www/

12. Restore nginx/SSL

scp -r /z/digitalocean/nginx/* root@DROPLET:/etc/nginx/
scp -r /z/digitalocean/letsencrypt/* root@DROPLET:/etc/letsencrypt/

13. DNS

Each domain: A records pointing to new droplet IP

Phase 4: Restore Data

14. Google Takeout

All products, 50 GB chunks

15. Download + Extract

Pipeline handles automatically

Phase 5: Automation

16. Task Scheduler

Register daily sync (PowerShell, see Section 13)

17. Droplet cron

Nightly backups + cert renewal

18. Scheduled agent

Recreate Takeout monitor via RemoteTrigger

16. Disaster Recovery

If the Beelink dies

If the NAS dies

If the droplet dies

If everything dies at once

Recovery Time Objectives

~4 hours to get 120 websites back online after droplet loss
~1-2 days to rebuild photo archive from new Google Takeout
~1 week for complete environment restoration with all automations

Critical Files to Never Lose