And it cost me less than $10.
Let me tell you what happened last Tuesday night.
I sat down with Anthropic's Claude — not Claude the colleague, Claude the AI — and said: "I want to take all my 90+ LinkedIn articles and host them on my own website."
Four hours later, PaddySpeaks.com was live. I started manually with seven articles. Horizontal scroll. Custom typography. Mobile responsive. Hosted. SSL certified. Domain connected. Total infrastructure cost: $8.50/year.
No React. No Next.js. No Vercel. No Docker. No terminal (mostly). No Stack Overflow tabs open.
Just me and an AI, talking.
Then something clicked. If seven articles worked, why not all of them?
So I exported my entire LinkedIn archive — 90+ articles — and published them all. Every piece I'd ever written, now living on my own domain, outside LinkedIn's algorithm, under my control.
Is the site perfect? No. With 90+ articles, it desperately needs search and better navigation. The UI/UX for discovering content at that scale is a problem I'm still solving. But here's the thing: it's live. Every word I've published over the years is now mine, hosted on infrastructure I control, for the price of a fancy coffee per year.
This is what they're calling vibe coding. And after experiencing it firsthand, I'm convinced it's about to change everything about how we build software.
Step 1: Breaking Free From LinkedIn's Walled Garden
Here's something most people don't know.
LinkedIn lets you export all your data. Every article. Every post. Every connection. It's buried, but it's there.
Go to Settings → Data Privacy → Get a copy of your data.
Select "Articles" and wait for the email. LinkedIn sends you a ZIP file. Inside that ZIP? An Articles folder containing HTML versions of every longform article you ever published.
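If you want to see what's actually in that ZIP before building anything, a few lines of Python will do it. A rough sketch — the folder name inside your export may differ slightly, and the title regex assumes each article file has a well-formed `<title>` tag:

```python
import re
import zipfile
from pathlib import Path

def list_articles(export_zip: str) -> dict[str, str]:
    """Map each article filename in the LinkedIn export ZIP to its <title>.

    Assumes the layout described above: an 'Articles' folder inside the
    ZIP containing one HTML file per long-form article.
    """
    titles = {}
    with zipfile.ZipFile(export_zip) as zf:
        for name in zf.namelist():
            if "Articles" in name and name.endswith(".html"):
                html = zf.read(name).decode("utf-8", errors="replace")
                m = re.search(r"<title>(.*?)</title>", html, re.S | re.I)
                titles[Path(name).name] = m.group(1).strip() if m else "(untitled)"
    return titles

# Example (filename is illustrative — use whatever LinkedIn emailed you):
# for fname, title in list_articles("linkedin_export.zip").items():
#     print(f"{fname}: {title}")
```

Ten seconds of inventory beats discovering halfway through the build that three articles didn't export.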
I looked at those files and thought: why does LinkedIn own the only copy of my professional voice?
So I extracted everything and asked Claude to build me a website.
But here's where it gets interesting — and messy.
What LinkedIn gives you: Clean HTML of your article text. Titles, body content, formatting. It's all there and it's surprisingly well-structured.
What LinkedIn does NOT give you: The images. None of them. Not your cover photos, not your inline images, not the charts and diagrams you spent hours creating. LinkedIn's article export is text-only.
There IS a "Rich Media" download — a separate link on the data export page where you can filter by Photos, Videos, or Documents. But here's the kink: it's incomplete and unreliable. Some images download, some don't, and there's no clear logic to what's included versus what's missing. It also doesn't map images back to specific articles — you just get a pile of files with timestamps.
My workaround for now: I generated new hero images using AI (the irony is not lost on me — AI writing about AI illustrated by AI). For inline images like comparison tables, I recreated them in the new format. It's not perfect. I'm still working on a cleaner extraction method. If anyone has cracked this, I want to hear from you.
The bigger point stands: your words, your ideas, your intellectual output — LinkedIn will hand them back to you. But only barely. And only if you know where to look.
Step 2: Vibe Coding — The Part That Should Terrify Every Bootcamp
Here's how the conversation went. I'm paraphrasing, but barely:
Me: "I have 7 articles exported from LinkedIn. I want a blog-style website. Clean. Editorial. Think literary magazine, not tech blog. Horizontal scroll for the article cards. A featured article at the top. A Bhagavad Gita quote as a divider."
Claude: Builds the entire thing.
Not a wireframe. Not pseudocode. Not "here's a React boilerplate to get started." The actual HTML. The actual CSS. All seven article pages. Navigation. Footer. About page. Responsive breakpoints. Typography system with Playfair Display and Source Serif. Reading progress bar. Social share buttons.
Everything.
When something looked off — an image too large, a scroll not snapping right, a mobile layout breaking — I didn't open DevTools. I just said "the images are cluttered and there's no horizontal scrolling" and Claude rebuilt the entire stylesheet.
This is vibe coding. You describe the vibe. The AI writes the code.
The kinks nobody tells you about:
1. You still need taste. Claude can write 1,500 lines of CSS, but it can't tell you if the result looks good. I went through probably 6-7 iterations because the first version was technically correct but aesthetically dead. Vibe coding is not "say it and forget it." It's a conversation. You're the creative director.
2. Claude doesn't know what your site looks like live. This was the biggest debugging challenge. I'd describe a problem, Claude would fix what it thought I meant, and sometimes the fix addressed a different issue entirely. When I said "the images are cluttered," Claude couldn't see my screen. It had to infer from my description. Sometimes it took 2-3 rounds to converge on the actual problem.
3. The architecture decisions matter more than the code. The original LinkedIn site was a 3,400-line monolithic JavaScript single-page app. Claude's first instinct was to keep that architecture and just add articles. I had to push for a full rebuild into static HTML — which turned out to be the right call for GitHub Pages. The AI will follow your architectural lead. If your lead is wrong, you'll get beautifully wrong code.
4. CSS inline vs. external is a real deployment question. We went through three rounds of "everything looks broken on the live site" before realizing that GitHub Pages wasn't loading the external style.css file correctly. The fix? Inline the entire CSS into every HTML file. Not elegant. But bulletproof. Sometimes ugly solutions that work beat beautiful solutions that don't.
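If you'd rather not hand-paste 1,500 lines of CSS into every page, a small script can do the inlining. A sketch, assuming one shared style.css and a standard stylesheet `<link>` tag in each HTML file:

```python
import re
from pathlib import Path

def inline_css(site_dir: str, css_file: str = "style.css") -> int:
    """Replace the first <link rel="stylesheet" ...> tag in each HTML file
    with an inline <style> block, so the page no longer depends on an
    external file. Returns the number of files rewritten.
    """
    root = Path(site_dir)
    css = (root / css_file).read_text(encoding="utf-8")
    link_re = re.compile(r'<link[^>]*rel=["\']stylesheet["\'][^>]*>', re.I)
    changed = 0
    for page in root.rglob("*.html"):
        html = page.read_text(encoding="utf-8")
        # Lambda replacement avoids regex escape issues in the CSS itself
        new_html, n = link_re.subn(lambda m: f"<style>\n{css}\n</style>", html, count=1)
        if n:
            page.write_text(new_html, encoding="utf-8")
            changed += 1
    return changed
```

Run it once before pushing, and every page carries its own styles — no path, no cache, no silent failure.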
Step 3: The $8.50 Infrastructure (And The Deployment Graveyard)
Now I had a beautiful static site. Time to put it somewhere. This is where the real education happened.
Attempt 1: Netlify — Free Until It Isn't
Should have been simple. Drag, drop, done. And honestly, the first deploy worked. The second one too. But here's what nobody mentions in the "deploy your site in 60 seconds" tutorials: Netlify's free tier has build minute limits. When you're vibe coding — iterating fast, pushing updates every 10 minutes because Claude just fixed the scroll behavior — you burn through those credits shockingly fast. A few deploys in, I hit the wall. For a collection of static HTML files that don't even need a build step, I was being metered on deployment frequency. I walked away.
Attempt 2: GitHub Pages — Deceptively Simple, Surprisingly Tricky
Created a repo. Pushed the files. Enabled Pages. And then... nothing worked right.
Here's what they don't tell you about GitHub Pages:
Your repo structure IS your URL structure. If your file is at /articles/my-post.html, your URL is yoursite.com/articles/my-post.html. There's no routing layer. No rewrites. What you push is what you get. This sounds obvious, but when you're used to frameworks handling routing, it's a mindset shift.
Caching is aggressive. I pushed updates and kept seeing the old version. GitHub Pages caches aggressively, and if you're iterating fast (which you are when vibe coding), you'll think your changes didn't deploy. Hard refresh. Clear cache. Wait 5 minutes. Repeat.
External CSS files can silently fail. If the path is wrong by even one character, you get zero error messages. Your site just renders as unstyled HTML — giant images, default fonts, no layout. The first time this happened, I thought GitHub had broken my site. It was a missing ../ in a path.
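Since Pages won't warn you, it's worth checking your paths before you push. A rough pre-flight sketch — it only handles simple src/href attributes, not query strings or srcsets:

```python
import re
from pathlib import Path

def find_broken_paths(site_dir: str) -> list[tuple[str, str]]:
    """Return (html_file, reference) pairs for every relative src/href
    that doesn't resolve to a real file. GitHub Pages gives no error for
    a bad path — the page just renders unstyled or image-less.
    """
    attr_re = re.compile(r'(?:src|href)=["\']([^"\']+)["\']', re.I)
    broken = []
    for page in Path(site_dir).rglob("*.html"):
        for ref in attr_re.findall(page.read_text(encoding="utf-8")):
            # Skip absolute URLs, in-page anchors, and mailto links
            if ref.startswith(("http://", "https://", "//", "#", "mailto:")):
                continue
            target = (page.parent / ref.split("#")[0]).resolve()
            if not target.exists():
                broken.append((str(page), ref))
    return broken
```

An empty list back means your missing ../ problems get caught locally, not on the live site.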
The CNAME file is sacred. When you connect a custom domain, GitHub creates a CNAME file in your repo. If you accidentally delete it during a push, your custom domain disconnects. Ask me how I know.
Attempt 3: Cloudflare Domain — The $8.50 Miracle
Bought paddyspeaks.com through Cloudflare's registrar for $8.50/year. At cost. No markup. No hidden fees. No "renewal at 3x the price" surprise.
Connecting it to GitHub Pages:
Add four A records in Cloudflare DNS pointing to GitHub's IPs: 185.199.108.153, 185.199.109.153, 185.199.110.153, 185.199.111.153
Add a CNAME record — name www, target yourusername.github.io
In GitHub repo settings → Pages → Custom domain: type your domain and save
Wait for DNS propagation (took about 10 minutes with Cloudflare, can take up to 48 hours)
Enable "Enforce HTTPS" once GitHub provisions the SSL certificate
The gotcha: If you're using Cloudflare's proxy (orange cloud icon), you might need to temporarily set it to "DNS only" (grey cloud) while GitHub provisions the SSL certificate. Once HTTPS is working, you can turn the proxy back on for Cloudflare's CDN benefits.
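Rather than guessing whether propagation finished, you can check the records yourself. A minimal sketch (the comparison logic here is mine, not an official GitHub check):

```python
# The four GitHub Pages apex IPs from the A records above
GITHUB_PAGES_IPS = {
    "185.199.108.153",
    "185.199.109.153",
    "185.199.110.153",
    "185.199.111.153",
}

def check_apex(resolved: set[str]) -> dict[str, set[str]]:
    """Compare what your apex domain actually resolves to against GitHub's
    published IPs. 'missing' = records you still need to add; 'extra' =
    stale records to remove. Note: with Cloudflare's proxy ON (orange
    cloud) you'll see Cloudflare IPs here instead, which is expected once
    HTTPS is working.
    """
    return {
        "missing": GITHUB_PAGES_IPS - resolved,
        "extra": resolved - GITHUB_PAGES_IPS,
    }

# Live usage (requires network access):
#   import socket
#   ips = set(socket.gethostbyname_ex("paddyspeaks.com")[2])
#   print(check_apex(ips))
```

Two empty sets means DNS is done and any remaining weirdness is caching, not configuration.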
Total cost: $8.50/year for the domain. Everything else is free. Hosting, SSL, CDN, global distribution.
The SaaS industry is charging teams $50-200/month for website builders. I built something more custom, more personal, and more performant for the cost of a fancy coffee. Once a year.
The Honest Recap: What Worked, What Didn't, What's Still Broken
Let me give you the real scorecard:
✅ What worked beautifully:
Claude generating 9 complete HTML files (index, about, 7 articles) with consistent styling
The editorial design — Playfair Display headers, Source Serif body text, gold accents
Horizontal scroll with snap-to-card behavior
Mobile responsiveness without me writing a single media query
GitHub Pages + Cloudflare deployment for essentially free
⚠️ What required multiple iterations:
Getting the CSS architecture right (went from external file → inline after deployment issues)
Image sizing and layout (Claude couldn't see the rendered result)
Path references between index.html and articles in subfolders (images/ vs ../images/)
Getting GitHub Pages to actually serve the updated files (caching)
❌ What's still not solved:
Extracting original images from LinkedIn articles (the export doesn't include them)
Article engagement metrics (comments, likes) — LinkedIn doesn't export these with articles
Search/filtering if the article count grows beyond 10-15
No CMS — every update requires pushing files to GitHub
🔨 What I'd do differently:
Start with inline CSS from day one (external CSS + GitHub Pages = headaches)
Use a flat file structure instead of subfolders (fewer path issues)
Keep the original LinkedIn article URLs documented somewhere (the export doesn't include them either)
Export your Rich Media separately BEFORE you start building — even if it's incomplete, some images are better than none
What This Means (And Why I Can't Stop Thinking About It)
I keep coming back to this: what I did in 4 hours would have been a two-to-three-month side project in 2023. Not because it's technically hard — it's HTML and CSS. But because of the thousand micro-decisions, the debugging, the "why isn't this div centering" rabbit holes, the responsive breakpoint testing, the deployment configuration.
Claude handled all of it through conversation.
Here's what vibe coding actually looks like in practice:
→ You describe intent, not implementation
→ You iterate by describing what's wrong, not by reading error logs
→ You make architectural decisions ("inline CSS so nothing breaks") by discussing tradeoffs, not by Googling best practices
→ You ship by dragging files, not by configuring CI/CD pipelines
I didn't write a single line of CSS. I described a vibe — "editorial, like a literary magazine" — and got 1,500 lines of production CSS with custom properties, responsive breakpoints, scroll-snap, and hover animations.
But I also spent real time on the things AI can't do: deciding the information architecture. Choosing which article to feature. Writing the About page bio. Picking the Bhagavad Gita quote. Knowing that the horizontal scroll looked wrong before Claude knew it.
The skill isn't coding anymore. The skill is knowing what to ask for. It's taste. It's editorial judgment. It's understanding what good looks like and being able to describe it in words.
That's not a technical skill. That's a deeply human skill.
The Uncomfortable Truth
If you're a junior developer reading this: please don't panic. But please do pay attention.
I'm a data architect with 15+ years of experience. I've built systems at Meta and VMware. I know my way around code. And I chose not to write any of it.
Not because I couldn't. Because I didn't need to. The gap between "I have an idea" and "it's live on the internet" has collapsed to a single conversation.
I built PaddySpeaks to write about exactly this shift. About what happens when AI can do in hours what used to take weeks. About who benefits and who gets left behind. About what "being technical" even means when you can build a full website by describing what you want.
Try It Yourself. Seriously.
Here's the step-by-step, kinks included:
1. Export your LinkedIn articles Settings → Data Privacy → Get a copy of your data → Select "Articles" → Request archive. You'll get an email in 10-30 minutes with a ZIP. Also hit "View Rich Media" and download your photos — you'll need them, and LinkedIn won't include them with the articles.
2. Start a conversation with Claude "Build me a website for these articles." Paste the article content. Describe the vibe, not the implementation.
3. Iterate ruthlessly Your first version will be technically correct and aesthetically mediocre. Push back. Describe what's wrong. Give Claude context about what you're seeing. "The images are too big" is better than "fix the layout."
4. Go with inline CSS Trust me on this one. External stylesheets + GitHub Pages + caching = debugging nightmares. Embed the CSS in every HTML file. It's not elegant, but it works everywhere, every time.
5. Push to GitHub Pages Create a repo. Push your files. Enable Pages in Settings. Test with the yourusername.github.io/reponame URL first before connecting a custom domain.
6. Buy a domain on Cloudflare ($8-12/year) Set up A records and CNAME pointing to GitHub. Add the CNAME file to your repo. Enable HTTPS. Done.
7. You now own your content. Forever.
Your professional voice shouldn't live in someone else's algorithm. Your best thinking shouldn't disappear into LinkedIn's infinite scroll, buried under AI-generated "thought leadership" and engagement bait.
Build your own corner of the internet. It costs less than a lunch and takes less time than a meeting.
I'll be writing more about this at PaddySpeaks.com — where ancient wisdom meets the architecture of tomorrow.
Now if you'll excuse me, I need to go figure out how to extract those damn images from LinkedIn.
The website discussed in this article was built entirely through conversational AI — no IDE was opened in the making of this blog.
♻️ Repost if you think more people should own their content. 💬 Drop a comment — have you tried vibe coding? Did you find a way to extract LinkedIn images? I want to hear it all.
