How I built this site
This site runs on a small, reliable stack for a personal profile and a Markdown blog. I prioritised clarity over novelty: predictable content files, simple rendering, strong SEO defaults, and tools to make writing fast.
Architecture & Content
The site uses the Next.js App Router with TypeScript and pnpm. All content is stored in the repository, organised by type: posts/page.md for the homepage, posts/career/page.md for my career history, posts/testimonials/page.md for testimonials, and posts/blog/<slug>/ for each blog post. A post consists of two files: meta.json for metadata and post.md for the content. The first H1 in post.md is used as the page title.
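A rough sketch of what reading one of those posts might look like; the metadata fields beyond the published flag are my assumptions, not the actual schema:

```ts
import { promises as fs } from "node:fs";
import path from "node:path";

// Hypothetical meta.json shape: only `published` is certain from the workflow
// described in this post; `date` and `description` are illustrative guesses.
interface PostMeta {
  published: boolean;
  date?: string;
  description?: string;
}

// Read posts/blog/<slug>/meta.json and post.md for a single post.
async function loadPost(slug: string): Promise<{ meta: PostMeta; markdown: string }> {
  const dir = path.join(process.cwd(), "posts", "blog", slug);
  const meta = JSON.parse(
    await fs.readFile(path.join(dir, "meta.json"), "utf8"),
  ) as PostMeta;
  const markdown = await fs.readFile(path.join(dir, "post.md"), "utf8");
  return { meta, markdown };
}
```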
A build process handles drafts and scheduled posts, hiding them in production but showing them with status badges in development. The site is deployed on Vercel.
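A minimal sketch of that visibility rule, assuming meta.json carries a published flag and an optional publish date (field names as guessed above):

```ts
// Same illustrative PostMeta shape as above, trimmed to what's needed here.
interface PostMeta {
  published: boolean;
  date?: string; // ISO date, used for scheduling
}

type PostStatus = "published" | "draft" | "scheduled";

function postStatus(meta: PostMeta, now: Date = new Date()): PostStatus {
  if (!meta.published) return "draft";
  if (meta.date && new Date(meta.date) > now) return "scheduled";
  return "published";
}

// Production lists only published posts; development shows everything,
// and the status string can be rendered as a badge next to the title.
function isVisible(meta: PostMeta): boolean {
  return process.env.NODE_ENV !== "production" || postStatus(meta) === "published";
}
```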
Pipeline & Tooling
- Rendering: Markdown is rendered using unified, remark, and rehype (see the sketch after this list).
- Title extraction: A script parses the Markdown AST to extract the first H1 as the page title, rendering the rest as the body.
- Scaffolding: pnpm run new:post is a small CLI script that prompts for a title, generates a clean slug, and creates the meta.json and post.md files.
- Aggregation: A /llms.txt route serves a plain-text version of all site content for AI models.
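A condensed sketch of the rendering path and the title extraction together, assuming a helper roughly like this (the function name and return shape are illustrative, not the site's actual code):

```ts
import { unified } from "unified";
import remarkParse from "remark-parse";
import remarkRehype from "remark-rehype";
import rehypeStringify from "rehype-stringify";
import { toString } from "mdast-util-to-string";

// Render post.md to HTML, pulling the first H1 out as the page title.
export async function renderPost(markdown: string) {
  const processor = unified()
    .use(remarkParse)      // Markdown -> mdast
    .use(remarkRehype)     // mdast -> hast
    .use(rehypeStringify); // hast -> HTML string

  const tree = processor.parse(markdown);

  // Take the first depth-1 heading as the title and drop it from the body.
  let title: string | undefined;
  tree.children = tree.children.filter((node) => {
    if (!title && node.type === "heading" && node.depth === 1) {
      title = toString(node);
      return false;
    }
    return true;
  });

  const html = processor.stringify(await processor.run(tree));
  return { title, html };
}
```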
Writing a new post
- Run pnpm run new:post and provide a title (a sketch of such a script follows this list).
- The script creates a new folder in posts/blog/ with a meta.json file (marked as unpublished) and a post.md file.
- Edit the metadata and write the post.
- Set published: true to publish.
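What a scaffolding script like that might look like; the prompt, slug rules, and generated meta.json fields below are illustrative, not the real CLI:

```ts
#!/usr/bin/env node
// Sketch of a new:post scaffolding script (run as an ES module).
import { mkdir, writeFile } from "node:fs/promises";
import path from "node:path";
import { createInterface } from "node:readline/promises";

function slugify(title: string): string {
  return title
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-") // collapse anything non-alphanumeric into hyphens
    .replace(/^-+|-+$/g, "");    // trim leading/trailing hyphens
}

const rl = createInterface({ input: process.stdin, output: process.stdout });
const title = await rl.question("Post title: ");
rl.close();

const dir = path.join("posts", "blog", slugify(title));
await mkdir(dir, { recursive: true });
await writeFile(
  path.join(dir, "meta.json"),
  JSON.stringify({ published: false }, null, 2) + "\n",
);
await writeFile(path.join(dir, "post.md"), `# ${title}\n\n`);
console.log(`Created ${dir}`);
```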
Styling and SEO
Styling uses Tailwind CSS, including the @tailwindcss/typography plugin for readable article text. The font, Public Sans, is loaded via next/font to prevent layout shift. I use Base UI for unstyled, accessible components like the main navigation menu.
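The font wiring in an App Router layout looks roughly like this (a sketch, not the site's exact file):

```tsx
// app/layout.tsx (sketch) — next/font self-hosts Public Sans and applies it
// via a generated class, so text doesn't shift when the font loads.
import { Public_Sans } from "next/font/google";
import type { ReactNode } from "react";

const publicSans = Public_Sans({ subsets: ["latin"] });

export default function RootLayout({ children }: { children: ReactNode }) {
  return (
    <html lang="en" className={publicSans.className}>
      <body>{children}</body>
    </html>
  );
}
```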
The Next.js Metadata API generates page titles, descriptions, and Open Graph cards. A sitemap.xml and robots.txt are also generated dynamically.
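For the sitemap, the App Router convention is a sitemap.ts file in app/ (robots.ts works the same way). A sketch, with a placeholder domain and a stubbed post list:

```ts
// app/sitemap.ts (sketch) — Next.js serves the returned entries as /sitemap.xml.
// The domain and the post-listing stub are placeholders.
import type { MetadataRoute } from "next";

const SITE_URL = "https://example.com";

// Stub: the real site would read posts/blog/ and skip unpublished posts.
function publishedSlugs(): string[] {
  return ["how-i-built-this-site"];
}

export default function sitemap(): MetadataRoute.Sitemap {
  return [
    { url: SITE_URL, lastModified: new Date() },
    ...publishedSlugs().map((slug) => ({ url: `${SITE_URL}/blog/${slug}` })),
  ];
}
```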
Why this approach
This setup optimises for writing speed and simplicity. Content lives with the code, the rendering path is testable, and the site avoids common issues like broken links or missing metadata. It's a solid foundation that can be extended later.
Working with AI
I used OpenAI Codex to accelerate routine work, keeping final decisions in my hands. I set the direction and constraints, then asked for focused changes like scaffolding pages, building the post generator, or drafting documentation. I reviewed every patch for consistency and correctness.
For example:
- The AI assistant implemented the content loaders and the unified/remark/rehype pipeline based on my specifications.
- It wrote the initial new:post CLI, which I then refined.
- It also fixed bugs, such as a type error on the /resume route.
- This post was drafted by an AI from my project notes and commit history, then edited by me for clarity.
The result is a small, extensible codebase. The AI accelerated the work, but the design, constraints, and final review were human-led.