WholeTech Network Traffic Audit

April 7, 2026 · Source: AWStats on droplet 143.198.182.180 · 119 sites analyzed

TL;DR — the headline number is misleading. AWStats shows 11,018 visits/mo across the network. After stripping out unfiltered bot traffic and looking at search-engine referrals, the realistic count of actual humans is closer to 1,500–2,500 visits/mo. The good news: one site (firth.com) is doing real organic search and shows the path forward for the rest.
119Total sites
11,018Awstats visits
~2,000Real humans (est)
11Zero-traffic sites
3AdSense authorized
12Requires review
97Getting ready
0ads.txt deployed

AdSense status snapshot (April 7, 2026 9:00 PM CDT)

🚨 Critical fix needed: ads.txt is "Not found" on every site. AdSense's per-site report shows the ads.txt status as "Not found" for all 110 submitted sites. Without an /ads.txt file at the root of each domain containing your publisher line, programmatic advertisers can't verify your inventory and CPMs will be lower. This is a one-line file per site and can be deployed network-wide in seconds. Highest-priority fix.
StatusCountMeaning
Ready1Live and serving ads (tvreviewer.com)
Requires review12Awaiting AdSense human review
Getting ready97In automated review queue (1–14 days typical)
Authorized (separate column)3tvreviewer.com, wholetech.com, aiwholetech.com
ads.txt deployed0Needs immediate deployment

Sites requiring review (12)

worpressmeetup.com, stonedom.com, atxtechtrends.com, austinspring.com, taichipaul.com, austintechnewslive.com, alexsmallenginerepair.com, changebastrop.com, askemai.com, bnbhot.com, aquachifit.com, aiwholetech.com

Submission anomalies

Top 10 by AWStats visits (raw)

#SiteVisitsUniqueReal channel
1tvreviewer.com1,013871Mostly bots
2firth.com801756Google search + type-in
3austen.com579544Type-in only
4wholetech.com342255Internal + direct
5bnbhot.com174160Mixed (DNS now points to droplet)
6austinspring.com160152Bot floor
7adultstory.com149124Bot floor
8aiwholetech.com126114Bot floor
9austinlifestyles.com122114Bot floor
10offgridder.com122114Bot floor

Part A — Why firth.com ranks (and the others don't)

firth.com is the only site in the network with measurable Google organic traffic. The drilldown explains why.

Factorfirth.comausten.comtvreviewer.com
AWStats visits8035791,013
Google search referrals14532
Other search engines4140
Direct / type-in / no-referrer892656~1,000
Identifiable bot hits4,1252,315~1,300
HTML pages on disk47419565
Indexed URLs (AWStats SIDER)370212565
Has sitemap.xmlYesYesYes
Has robots.txtYesYesYes
Schema.org markupNoneNoneNone
Domain established19981997~2024
The differentiator is age plus content depth plus subject specificity. firth.com has 474 pages of Colin-Firth-specific content built up over 28 years. austen.com is the same age but only has 19 pages — it's coasting on the premium domain alone. tvreviewer.com has 565 pages but is a brand-new domain Google doesn't trust yet. firth.com is the only site that has all three: old domain + deep content + tightly focused subject.

What firth.com's organic traffic actually looks like

Top landing pages from AWStats SIDER:

URLHits
/267
/about.html57
/about48
/awards.html21
/bio.html13
/news.html12
/upcoming.html11
/resources.html10
/father_rev.html10

People are searching things like "Colin Firth," "Colin Firth biography," "Colin Firth awards," and landing on the homepage and about pages. This is exactly what you'd expect for a fan site that's been online since 1998.

What austen.com is missing

austen.com has the same age and same robots/sitemap setup as firth.com but only 19 HTML pages. It should be ranking for "Jane Austen novels," "Pride and Prejudice analysis," "Sense and Sensibility text," "Emma full text," etc. — and it isn't, because there isn't enough content for Google to rank against. It's surviving on type-in traffic alone, which is fragile (no SEO moat).

This is the highest-leverage opportunity in the network: replicate firth.com's depth on austen.com. Add a page for each novel, each major character, each adaptation. The domain authority is already there.

Part B — Google Search Console

GSC drill-down deferred. This audit was generated by Claude Code on the Beelink. There is no Google Search Console MCP integration configured in this Claude instance, so we can't pull GSC impressions/clicks/queries data programmatically. The AWStats numbers above are the best available proxy.

To unlock the next level of traffic intelligence, the recommended next steps are:

Part C — Mid-tier site drilldown (austinspring, offgridder, aquachifit)

Same drilldown applied to three randomly-chosen mid-tier sites to confirm the bot-floor hypothesis:

SiteVisitsIndexed URLsBot hitsSearch referralsVerdict
austinspring.com164155570Pure bot floor
offgridder.com12491930Pure bot floor
aquachifit.com11791371 (Baidu)Pure bot floor
Hypothesis confirmed. All three mid-tier sites have zero Google referrals. Their AWStats "visits" are entirely bot crawl + a small amount of unfiltered direct hits. None of them have earned any organic discovery. This is consistent across the entire long tail of 96 sites in the 50–200 visit band.

The common pattern: 9–15 indexed URLs each. That's not enough surface area for Google to rank against. firth.com has 474 pages and ranks; these have 9 and don't. The difference isn't quality — it's quantity of indexable, unique content per domain.

Strategic recommendations

Tier 1 — Highest leverage

  1. Replicate firth.com's depth on austen.com. Add 50–100 pages of unique Jane Austen content: per-novel deep dives, character profiles, adaptation history, full novel texts with annotations, fan fiction archive index, biography. Domain authority is already there; only content is missing.
  2. Verify all sites in Google Search Console. Without GSC, we're flying blind on what Google thinks of each site. This unlocks query-level data and is free.
  3. Build out tvreviewer.com further. 565 pages already exist; the issue is Google trust on a new domain. Continue adding archival depth (more years, more categories, more cross-linking between years) and earn at least one inbound link from a real TV-industry site.

Tier 2 — Medium leverage

  1. Add schema.org markup to firth.com, austen.com, tvreviewer.com. All three currently have zero schema markers. Adding Person, Movie, Book, CreativeWork markup would compound their existing strengths.
  2. Pick 5 mid-tier sites and turn them into "depth" sites. Don't try to fix all 96. Suggested: austinspring, offgridder, aquachifit, taichipaul, buildercamp. Each gets 20–50 pages of unique, query-targeted content over the next month.
  3. Update memory: bnbhot.com DNS resolves to your droplet now. Earlier guidance to skip it was based on stale info. It can be added to AdSense.

Tier 3 — Lower leverage / housekeeping

  1. Set realistic AdSense expectations. First 1–3 months at ~2,000 real human visits/mo will likely yield $5–30/mo gross. Revenue scales with real human traffic, not awstats numbers. This is normal for a new publisher.
  2. Don't kill the long tail yet. The 60–100 visit "underperformers" aren't actually underperforming the leaders by as much as it looked — they're all in the same "indexed but undiscovered" stage. Wait 6–12 months and re-audit before pruning.
  3. Fix zero-traffic plumbing. Bergeron sites need DNS pointed (waiting on Nancy). barneyfrauenthal.com should be removed from AdSense (password-protected, will never verify). alexsmallenginerepair.com should be deleted entirely.

Bot reality check

For context on how much of "traffic" is bots, here are the top crawler hits across the three drilled-down sites:

tvreviewer.com top bots

firth.com top bots

firth.com gets 4,125 identifiable bot hits/mo — 5x the human traffic. This is normal for any publicly-indexed website. The lesson: AWStats "visits" is an inflated number for every site, everywhere, but the relative ranking still tells you which sites are most attractive to crawlers (and crawlers care about authority, age, and update frequency — rough proxies for the things humans also care about).


The single most important sentence in this report: firth.com proves the WholeTech network model can work — old domain + deep content + tight focus = real organic traffic. Every other site in the network either has the wrong domain age (most), the wrong content depth (almost all), or both. The path forward isn't more sites; it's more depth on the sites you already own.

Generated April 7, 2026 by Claude Code · Source: /var/lib/awstats/ on droplet 143.198.182.180 · wholetech.com · stats hub