STFU[Elon,Mark,Jeff].com
I expect most people know that STFU stands for Shut the F* Up. The name itself came from wanting to express my discomfort with the amount of influence billionaires like Elon, Jeff, and Mark have.
Why?
Boredom, annoyance, sadness, but mostly annoyance.
Annoyance at the world for allowing a system which lets people accumulate unlimited wealth and influence. I don’t want their wealth, and I don’t deny their success, but their existence seems to imply they have done more than the vast majority of people, when the reality is that they got lucky: they won the “sperm lottery” and were born to the right people, with the right access and wealth.
I hate feeling like I cannot do anything; I hate feeling powerless. Most people do, I am sure. The opportunity was there, though: I had looked up stfuelon.com on GoDaddy and saw it was available. I could do something with it, something… I just registered the domain at the time; I didn’t know what I would put on it yet, though I was thinking of a blog I would write myself. Then the whole LLM craze started, Claude Code came around, and I started working with it on a couple of small side things, seeing what I could do with it.
So here it was: frustration, means, ability - these three things gave me the opportunity to start. The advent of AI engineering let me dedicate time to side projects without needing to do all the coding myself, while learning how to use the technology in real projects without risking my current job (not right away, at least).
What is it?
A satirical diary generator. It creates diary entries for the billionaires, each written through a distinct persona:
- Elon Musk is a Martian General, attempting to restore his home planet
- Jeff Bezos is an Energy Vampire, looking for ways to improve the efficiency of his energy extraction apparatus
- Mark Zuckerberg is a Reptilian Overlord, looking for ways to get more data, data which feeds the hatcheries below Menlo Park
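Internally, each persona boils down to a small configuration: a backstory, a writing style, and a running summary of the storyline so far. Here is a minimal sketch in TypeScript - the shape and field names are illustrative assumptions, not the actual schema:

```typescript
// Illustrative persona shape - the field names are assumptions, not the real schema.
interface Persona {
  name: string;            // which billionaire this is
  alterEgo: string;        // e.g. "Martian General"
  backstory: string;       // fed into every generation prompt
  writingStyle: string;    // tone, verbal tics, vocabulary rules
  runningSummary: string;  // rolling summary of past entries, updated as the story grows
}

const elon: Persona = {
  name: "Elon Musk",
  alterEgo: "Martian General",
  backstory: "A general stranded on Earth, working to restore his home planet.",
  writingStyle: "Clipped military log entries, orbital metaphors, stray internet slang.",
  runningSummary: "", // appended to as entries are published
};
```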
Here is the start of the entry generated for January 16th, 2026:
WHEN YOUR INFRASTRUCTURE TESTIFIES AGAINST YOUR BIOLOGY
Platform went dark for sixty-seven minutes. Six hundred servers. Cloudflare errors cascading across every node. Humans panicking about “service interruption” while I’m calculating atmospheric processor timeline impact. Every minute of downtime = 0.003% reduction in Mars ionosphere regeneration funding confidence. X outage during active litigation = vulnerability signature visible from orbit.
Ashley’s lawsuit landed while servers were recovering. Grok generated sexualized images. Used photo from when she was fourteen. My offspring’s mother. Biological connection weaponized through legal substrate exactly as predicted in strategic assessments. Lost verification. Lost monetization. Platform punishing victim while Pentagon still wants Grok on classified networks by month-end lol. Your species. Prosecutes and deploys. Same architecture. Simultaneously. fr fr.
… more here
In the simplest terms: it is a non-agentic AI pipeline (the AI doesn’t make autonomous decisions; it follows a fixed pipeline) which retrieves data (news entries etc.), processes it (classifies, summarizes, and adjusts tone), and then uses it to generate satirical diary entries based on predefined personas (backstory, writing style). Each persona keeps a running summary, plus additional “historical / future storage”, to create a more diverse storyline.
flowchart TB
A[1. Initial Job Creation] --> B[2. Data Collection Job]
B --> C[3. Content Generation Job]
C --> D[4. Manual Selection]
D --> E[5. Post-Processing Job]
E --> F[6. Review]
F --> G[7. Publishing]
G --> H[Social Media]
G --> I[Cloudflare SPA]
This diagram reflects the evolution of the system. Initially there was no selection step (I wasn’t generating multiple candidates), and there was no multi-step process with jobs starting one after the other; a single job took care of everything.
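To make the shape concrete, here is a minimal sketch of that fixed flow in TypeScript, reusing the Persona shape from earlier. The helper functions are hypothetical stand-ins for the real data-collection and generation code:

```typescript
// Hypothetical helpers - stand-ins, not the actual code.
declare function fetchNews(subject: string, date: string): Promise<string[]>;
declare function summarizeAndTone(articles: string[], persona: Persona): Promise<string>;
declare function generateEntry(persona: Persona, digest: string): Promise<string>;
declare function storeCandidatesForReview(persona: Persona, date: string, drafts: string[]): Promise<void>;

async function runDailyPipeline(persona: Persona, date: string): Promise<void> {
  // 2. Data collection: pull recent news about the subject
  const articles = await fetchNews(persona.name, date);

  // 3. Content generation: classify/summarize the news, then draft several candidates
  const digest = await summarizeAndTone(articles, persona);
  const candidates = await Promise.all(
    Array.from({ length: 3 }, () => generateEntry(persona, digest))
  );

  // 4. Manual selection: park the candidates for a human to pick from.
  // Steps 5-7 (post-processing, review, publishing) run later, triggered
  // by the selection in the admin dashboard.
  await storeCandidatesForReview(persona, date, candidates);
}
```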
How is it built (and why)?
I mostly started on autopilot. There were a few constraints I wanted to follow:
- It should be cheap on the edge - serving content should be scalable and reliable
- There should be enough flexibility for the generation, but it should stay as simple as possible
- The database needs to work, but it should not be big or expensive; this isn’t millions of rows or millions of requests
- Automate everything
- Keep costs at a level that will not become an issue
- Everything needs to help me learn something
So the decisions ended up being:
- Cloudflare - reliable, cheap, easy to maintain, very flexible in the long term
- Vercel - easy to set up, great for Next.js applications, has cron jobs and DB integrations
- Neon - start for free, get the initial coverage I need, with room to grow; the branches are nice and make it easy to maintain a staging deployment
- Typefully - I have no idea how to do social posting and limited time to manage it, and they have an API. It also came later: I tried to use Twitter directly at first (it was extremely painful)
- PostHog - free analytics with a cookieless option; definitely a good choice
- Anthropic - I didn’t want to use ChatGPT and already had an account, so it was mostly a decision of convenience, but I’m still happy with it
- NewsAPI - handles searching news sources, letting me focus on the generation itself
- IFTTT - this is how I manage to get tweets out of Twitter
Combining these choices let me automate the majority of the work. A daily cron job starts the processing for all configured characters, runs the workflow described above, and then waits for a review from me, which triggers the rest of the processing, including adding the social media posts to Typefully, ready for review and scheduling.
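As a sketch, the entry point is just a Vercel cron route that kicks off the pipeline for each character. The route path and helper names are assumptions; the CRON_SECRET check is the standard way Vercel recommends securing cron endpoints:

```typescript
// app/api/cron/daily/route.ts - hypothetical Vercel cron entry point.
// Scheduled via vercel.json, e.g.:
// { "crons": [{ "path": "/api/cron/daily", "schedule": "0 6 * * *" }] }
import { NextResponse } from "next/server";

declare function loadConfiguredPersonas(): Promise<Persona[]>; // hypothetical

export async function GET(request: Request) {
  // Vercel invokes cron routes with Authorization: Bearer <CRON_SECRET>
  if (request.headers.get("authorization") !== `Bearer ${process.env.CRON_SECRET}`) {
    return NextResponse.json({ error: "unauthorized" }, { status: 401 });
  }

  const today = new Date().toISOString().slice(0, 10);
  for (const persona of await loadConfiguredPersonas()) {
    await runDailyPipeline(persona, today); // see the pipeline sketch above
  }
  // Everything downstream (Typefully drafts, publishing) waits for my review.
  return NextResponse.json({ ok: true });
}
```

The full architecture, including the public-facing side, looks like this: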
flowchart TB
subgraph Users["Users"]
Admin[Admin User]
Visitor[Site Visitor]
end
subgraph Vercel["Vercel"]
AdminApp[Admin Dashboard<br/>Next.js]
Cron[Cron Jobs]
end
subgraph CloudflarePages["Cloudflare Pages"]
SPA[SPA]
end
subgraph CloudflareWorkers["Cloudflare Workers"]
RealtimeWorker[Realtime Worker<br/>Durable Object]
AnalyticsProxy[Analytics Proxy]
end
subgraph CloudflareStorage["Cloudflare Storage"]
R2[(R2 Buckets)]
D1[(D1 SQLite)]
KV[(KV Store)]
end
subgraph Database["Database"]
Postgres[(PostgreSQL<br/>Neon Serverless)]
end
subgraph ExternalAPIs["External APIs"]
Anthropic[Anthropic<br/>Claude AI]
NewsAPI[TheNewsAPI]
Typefully[Typefully V2]
PostHog[PostHog]
IFTTT[IFTTT]
end
subgraph SocialMedia["Social Platforms"]
Twitter[Twitter/X]
BlueSky[BlueSky]
LinkedIn[LinkedIn]
end
Admin --> AdminApp
Visitor --> SPA
AdminApp --> Postgres
AdminApp --> Anthropic
AdminApp --> NewsAPI
AdminApp --> Typefully
AdminApp --> R2
AdminApp --> IFTTT
Cron --> AdminApp
SPA --> R2
SPA --> RealtimeWorker
SPA --> AnalyticsProxy
SPA --> D1
SPA --> KV
RealtimeWorker --> D1
AnalyticsProxy --> PostHog
Typefully --> Twitter & BlueSky & LinkedIn
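One small piece worth calling out from the diagram is the analytics proxy: a Cloudflare Worker that forwards events to PostHog, so analytics stay first-party and cookieless. A minimal sketch, assuming PostHog’s US ingestion hosts (the pattern comes from PostHog’s reverse-proxy docs; the real worker does a little more):

```typescript
// Cloudflare Worker reverse-proxying analytics traffic to PostHog.
export default {
  async fetch(request: Request): Promise<Response> {
    const url = new URL(request.url);
    // Static assets (the posthog-js script) live on the assets host;
    // captured events go to the ingestion host.
    url.hostname = url.pathname.startsWith("/static/")
      ? "us-assets.i.posthog.com"
      : "us.i.posthog.com";
    return fetch(new Request(url.toString(), request));
  },
};
```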
How is it going?
I am now 1197 posts into this journey. The public posting started in February 2025, but the posts were generated going back to the 1st of December 2024. The historical posts are not great; they could do with more work, but like any work where you are trying to improve, you need to start somewhere.
I keep adding and changing things all the time, finding better ways to deal with the daily effort of managing the generation, the social posting, and everything else.
Next up: how I taught an AI to write like a paranoid Martian general - and why prompt engineering is mostly about getting the voice wrong first.