
How I Built This Website and Why I Chose This Stack

11 min read · technology · personal

I'm not a frontend developer. I'm a Head of Data, and my usual stack is Python, SQL, Airflow, dbt, and BigQuery. I have some SEO experience, though it's mostly theoretical — I understand the principles, I've read a lot about technical SEO, but I've never done it professionally. Still, when I decided to make a personal website, I didn't want to take WordPress or Wix, click three buttons, and get a template page that looks like a million others. I wanted full control — over the code, over performance, over how my name appears in Google. So I built everything from scratch.

This article isn't a tutorial or a guide. It's the story of my process: what decisions I made, why exactly those, and what came out of it.

Why Not WordPress?

Let's start with the most obvious question. WordPress powers 43% of the web. It's free, there are millions of themes, plugins, hosting options for any budget. Why not use it?

Because WordPress is a dynamic site. Every request runs a PHP script that connects to MySQL, assembles the page on the fly, and serves it. For a blog with three articles, that's a sledgehammer to crack a nut. Add the vulnerabilities: WordPress needs constant updates, patches, and bot protection. For a personal site, that's unnecessary overhead.

What I needed was a static site — a set of HTML files that nginx serves instantly, without a database, without server-side logic, without PHP. Just files. Maximum speed, maximum simplicity, maximum security.

Choosing a Generator: Why Eleventy

When you're choosing a Static Site Generator (SSG), the options are vast: Hugo, Jekyll, Gatsby, Next.js, Astro, Eleventy... I tested several.

Hugo — written in Go, blazing fast, but Go templates are a separate language you need to learn from scratch. Documentation exists, but for non-trivial things you start googling for hours. Sure, AI assistants now solve this problem — you can ask them to write any template. But I want to at least have a basic understanding of what's happening in the code. Because AI makes dumb people even dumber, and makes smart people 50–60% more productive.

Next.js / Gatsby — React-based. For a personal site with three pages, dragging along React, Webpack (or Vite), SSR/SSG configuration, and 200 MB of node_modules is overkill. I'm not building a SaaS product; I need a blog.

Astro — an interesting option, but too much "magic" under the hood for my case. Islands, a component model, its own .astro format: great for complex projects, but for a personal site with a blog I wanted something simpler and more transparent.

Eleventy (11ty) won for several reasons:

  1. Simplicity. Eleventy doesn't force a framework on you. No React, Vue, Svelte — just templates and content. You write Markdown, pick a template engine (Nunjucks, Liquid, Handlebars), and get HTML. That's it.
  2. Flexibility. You control every byte of output. No "magical" pages or routing — there are files and folders that become URLs. Intuitive.
  3. Zero client JS by default. Eleventy doesn't inject any JavaScript into built pages. All JS is only what you add yourself. For speed, this is critical.
  4. ESM support. Eleventy 3.x fully supports ES Modules — no require/module.exports, pure modern JavaScript.
  5. Data cascade. The cascading data system, where a JSON file in a folder automatically applies to all files in it, was perfect for the multilingual architecture.

Multilingual: Three Languages Without Plugins

One of the main requirements was multilingual support: Ukrainian (primary) and English (for an international audience).

I don't use any i18n plugin. The architecture is dead simple:

  • Ukrainian pages live in the root src/ (URL: /)
  • English — in src/en/ (URL: /en/)

Each language folder has its own JSON file (e.g., en.json), which through the data cascade adds a lang variable to all files within it. Templates read UI translations from these variables, and page content is simply written manually in each language.

Hreflang tags are generated automatically via eleventyComputed.js — every page knows its alternates. The sitemap includes all three language versions with proper hreflang annotations. All of this — without a single external plugin.
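The URL-to-alternates mapping behind those hreflang tags can be sketched as a small pure function. This is an illustration of the idea, not the site's actual eleventyComputed.js; the LOCALES table and function name are invented:

```javascript
// Ukrainian pages live at "/", English under "/en/" (per the structure above).
const LOCALES = { uk: "", en: "en" };

function alternatesFor(url) {
  // Strip a leading locale prefix to recover the language-neutral path.
  const match = url.match(/^\/(en)(\/.*)?$/);
  const basePath = match ? (match[2] || "/") : url;
  // Emit one alternate per locale, re-prefixing as needed.
  return Object.entries(LOCALES).map(([lang, prefix]) => ({
    lang,
    url: prefix ? `/${prefix}${basePath}` : basePath,
  }));
}
```

With eleventyComputed, a function like this would feed both the `<link rel="alternate">` tags and the sitemap annotations from a single source of truth.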

CSS: Tailwind 4.x and CSS-First Approach

For styling, I chose Tailwind CSS. Not because it's trendy, but because the utility-first approach fits perfectly with templates — you see all styles right in the HTML, without jumping between files.

Tailwind 4.x changed the game — configuration now lives directly in the CSS file via @theme {}, without a separate tailwind.config.js. Plugins are connected via @plugin, and template paths via @source. Everything in one main.css file.

Custom components (glass-effect cards, buttons, gradient text) are defined via @utility — this lets you write .glass-card or .btn-primary as regular classes, but with all the benefits of Tailwind's tree-shaking.
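Put together, a Tailwind 4.x CSS-first setup looks roughly like this. The token names and values below are made up for illustration, not the site's real main.css:

```css
/* Illustrative fragment only — tokens and paths are assumptions */
@import "tailwindcss";
@source "../src/**/*.njk";

@theme {
  --color-accent: #6366f1;
  --font-sans: "Inter", sans-serif;
}

@utility glass-card {
  backdrop-filter: blur(12px);
  background: rgb(255 255 255 / 0.06);
  border-radius: 1rem;
}
```

Because glass-card is declared via @utility, it participates in Tailwind's scanning like any built-in class: unused, it disappears from the build.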

Fonts — Inter for text, JetBrains Mono for code. Both self-hosted as variable fonts in WOFF2, loaded with font-display: swap so text appears immediately while the font loads in the background. Zero requests to Google Fonts CDN — fewer connections, faster loading.

CSS build — via @tailwindcss/cli, without PostCSS. At build time, Tailwind tree-shakes and keeps only the classes actually used in templates. Final CSS — under 15 KB after minification.

JavaScript: Less Is More

No React, Vue, jQuery, or any framework. All the JavaScript lives in a single main.js, loaded with defer.

What it does:

  • Scroll animations — IntersectionObserver adds the .reveal class to elements when they enter the viewport. Smooth appearances without libraries
  • Mobile menu — open/close burger menu
  • Language switcher — language switching via simple <a> links (each language version is a separate URL, which is canonical for SEO)
  • Engagement widgets — views, likes, ratings, comments — fetch() to the API
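The scroll-reveal wiring is a few lines of vanilla JS. A sketch of the pattern, with an invented data-reveal attribute and threshold (the real main.js may differ):

```javascript
// Adds .reveal to marked elements as they enter the viewport.
function initReveal(doc = typeof document !== "undefined" ? document : null) {
  // No-op outside the browser (e.g. during tests or SSR-less builds).
  if (!doc || typeof IntersectionObserver === "undefined") return false;
  const observer = new IntersectionObserver((entries) => {
    for (const entry of entries) {
      if (entry.isIntersecting) {
        entry.target.classList.add("reveal");
        observer.unobserve(entry.target); // animate once, then stop watching
      }
    }
  }, { threshold: 0.15 });
  doc.querySelectorAll("[data-reveal]").forEach((el) => observer.observe(el));
  return true;
}
```

Unobserving after the first intersection keeps the observer cheap: each element triggers exactly one class change.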

All JS is minified at build time. Final size — a few kilobytes.

Backend: Express + SQLite for Engagement

The only dynamic part of the site is "engagement": article views, likes, 5-star ratings, and comments. This requires an API.

I wrote it on Express + better-sqlite3 — a minimalist server that:

  • Stores data in a single SQLite file (no MySQL, no PostgreSQL, no cloud databases)
  • Uses WAL mode for fast reads/writes
  • Has prepared statements for all queries — zero SQL injection, minimal overhead
  • Includes rate limiting (in-memory) against spam
  • Has a comment moderation system (pending → approved)

The entire API is one file server.js at ~450 lines. Runs as a systemd service on the server, nginx proxies /api/ to it. Why SQLite and not PostgreSQL? Because for a blog with a dozen articles, I don't need a separate database server. SQLite is one file on disk. Backup — one cp command. Done.
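The in-memory rate limiting mentioned above can be as simple as a sliding window per IP. A minimal sketch of that idea — the window size, limit, and function names here are invented, not lifted from server.js:

```javascript
// Sliding-window rate limiter: allow at most `max` hits per `windowMs` per IP.
function createRateLimiter({ windowMs = 60_000, max = 10 } = {}) {
  const hits = new Map(); // ip -> array of request timestamps

  return function allow(ip, now = Date.now()) {
    // Keep only timestamps still inside the window.
    const recent = (hits.get(ip) || []).filter((t) => now - t < windowMs);
    if (recent.length >= max) return false; // over the limit: reject
    recent.push(now);
    hits.set(ip, recent);
    return true;
  };
}
```

Used as Express middleware, `allow(req.ip)` returning false would translate into a 429 response. No Redis, no external store — state dies with the process, which is fine for spam protection on a personal blog.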

Hosting: Oracle VM + nginx + Cloudflare

I don't use Vercel, Netlify, or GitHub Pages. The site runs on an Oracle Cloud VM (Always Free tier — ARM64, 1/8 OCPU core, 1 GB RAM, 480 Mbit/s network). The absolute minimum, but for a static site — more than enough. Free with full control over the server.

Nginx serves as the web server, configured for maximum performance:

  • sendfile on, tcp_nopush, tcp_nodelay — minimizing system calls
  • open_file_cache — file descriptor caching so nginx doesn't hit disk on every request
  • ssl_buffer_size 4k — smaller TLS record for faster TTFB
  • brotli_static on + gzip_static on — nginx serves pre-compressed .br and .gz files with zero CPU cost at runtime
  • Separate caching rules: static assets — immutable, max-age=31536000 (1 year), HTML — shorter TTL so updates reach users quickly

Cloudflare sits in front of nginx as CDN:

  • HTTP/3 (QUIC) + 0-RTT — minimal latency
  • Edge caching for 30 days for HTML (with purge on every deploy)
  • Early Hints (103) — browser starts preloading CSS/JS before receiving HTML
  • DDoS protection out of the box
  • SSL via Origin Certificate

Deploy: git push → production in 2 minutes

The entire deployment is automated via GitHub Actions. One git push to main triggers the pipeline:

  1. Build & Test — npm ci → build CSS → build Eleventy → compress assets → lint HTML
  2. Deploy — rsync the built _site/ folder to the server via SSH
  3. Reload nginx — sudo nginx -t && sudo systemctl reload nginx
  4. Purge & Warm Cloudflare cache — purge_everything → wait → curl all URLs from sitemap.xml

From push to updated site — under 2 minutes. Zero manual intervention.
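A workflow implementing those steps might be outlined like this. The job names, script names, and secret names below are invented for illustration:

```yaml
# Hypothetical outline — not the repository's actual workflow file
name: deploy
on:
  push:
    branches: [main]
jobs:
  build-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: npm ci
      - run: npm run build        # CSS + Eleventy + compression + HTML lint
      - run: npm test
      - name: Deploy over SSH
        run: rsync -az --delete _site/ ${{ secrets.DEPLOY_TARGET }}
      - name: Purge Cloudflare cache
        run: |
          curl -X POST "https://api.cloudflare.com/client/v4/zones/${{ secrets.CF_ZONE }}/purge_cache" \
            -H "Authorization: Bearer ${{ secrets.CF_TOKEN }}" \
            --data '{"purge_everything":true}'
```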

The compression script (compress.mjs) walks through all files in _site/ and generates .br and .gz versions. This means nginx doesn't spend CPU on compression at runtime — it just serves ready-made files. Result: ~80% size savings, zero server load.
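The core of such a script is tiny thanks to node:zlib. A sketch of the pre-compression step (the real compress.mjs also walks the _site/ tree and writes files, which is omitted here):

```javascript
import { brotliCompressSync, gzipSync } from "node:zlib";

// Produce both pre-compressed variants of a file's contents.
function precompress(buffer) {
  return {
    br: brotliCompressSync(buffer), // served when Accept-Encoding includes br
    gz: gzipSync(buffer),           // fallback for gzip-only clients
  };
}
```

Writing the results next to the originals as file.html.br and file.html.gz is exactly what brotli_static and gzip_static expect on the nginx side.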

Compression and Optimization

I'm obsessed with speed (occupational hazard — when you work with data, milliseconds matter). Here's what I did:

Pre-compression. The compress.mjs script generates Brotli (.br) and Gzip (.gz) files for all HTML, CSS, JS, JSON, XML, SVG, and TXT. Nginx serves them via brotli_static and gzip_static — with zero CPU cost.

HTML minification. Eleventy minifies HTML at build time via html-minifier-terser — removes comments, excess whitespace, minifies inline CSS and JS.

Image optimization. The Eleventy Image shortcode generates AVIF + WebP + JPEG in multiple sizes (640, 960, 1280px) with loading="lazy" and decoding="async". The browser picks the optimal format and size automatically.

Service Worker. Caches critical assets on first visit. Repeat visits are instant. An offline page is shown when there's no internet.

Self-hosted fonts. Inter and JetBrains Mono as WOFF2 variable fonts — one file per font instead of a dozen separate ones. No external HTTP requests.

Result: TTFB ~30ms on Cloudflare edge, LCP < 1s, total page size — under 50 KB gzipped. PageSpeed Insights shows 99–100 for both mobile and desktop.

Testing: 102 Unit Tests + E2E

For a personal site, this might seem like overkill, but tests give me confidence that changes don't break anything. And in the AI era, not writing tests is simply rude and disrespectful — even to yourself. AI generates tests in seconds, all you have to do is run them. There's no excuse.

Vitest (unit tests):

  • HTML validation — every generated page is checked for correctness
  • i18n — all translation keys exist for all three languages
  • Schema.org — JSON-LD markup is valid and contains correct data
  • TTFB — real requests to klimnyk.dev verify the server responds in < 200ms
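The i18n check boils down to key-set parity between locales. An illustration of that idea in plain JavaScript (not the actual Vitest suite; the function name and sample data are made up):

```javascript
// Return the dotted paths of every key in `base` that is missing in `other`.
function missingKeys(base, other, prefix = "") {
  const missing = [];
  for (const [key, value] of Object.entries(base)) {
    const path = prefix ? `${prefix}.${key}` : key;
    if (!(key in other)) {
      missing.push(path);
    } else if (value && typeof value === "object") {
      // Recurse into nested translation groups.
      missing.push(...missingKeys(value, other[key] || {}, path));
    }
  }
  return missing;
}
```

In the real suite, asserting `missingKeys(uk, en)` is empty (and vice versa) catches a forgotten translation before it ever deploys.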

Playwright (E2E):

  • Navigation works correctly across all language versions
  • Mobile menu opens and closes
  • Language switcher leads to correct pages
  • Meta tags (title, description, og:*) are present

Total: 102 unit tests and 62 E2E tests. Run in CI before every deploy.

SEO: Not Just Meta Tags

As a Head of Data, I know the value of organic traffic. SEO here isn't an afterthought — it's one of the main reasons the site exists.

  • Schema.org — 7 types of JSON-LD markup: WebSite, Person, BreadcrumbList, WebPage, BlogPosting, Blog, ProfilePage
  • Hreflang — proper language alternates + x-default
  • Sitemap.xml — auto-generated with all URLs and hreflang
  • robots.txt — properly configured, doesn't block CSS/JS
  • llms.txt — file for LLM crawlers per the llmstxt.org standard, auto-updated
  • GEO meta tags — citation_*, article:published_time, max-snippet
  • Canonical URLs — each language version is a separate page with canonical pointing to itself
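For a sense of what one of those JSON-LD blocks involves, here is a sketch of building a BlogPosting payload. The field set and function name are illustrative, not the site's actual template code:

```javascript
// Serialize a minimal BlogPosting JSON-LD object for a <script> tag.
function blogPostingJsonLd({ title, url, datePublished, author }) {
  return JSON.stringify({
    "@context": "https://schema.org",
    "@type": "BlogPosting",
    headline: title,
    url,
    datePublished,
    author: { "@type": "Person", name: author },
  });
}
```

Generating these from page data at build time means the markup can never drift out of sync with the visible content.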

Why Not Simpler?

With AI, I spent one evening on the base structure of the site and now spend a couple of hours a week writing articles and making tweaks. I could have made the site on Notion, Tilda, or WordPress in one evening. But then I wouldn't have:

  • 30ms TTFB
  • Full control over markup and SEO
  • Automatic deployment from GitHub
  • Zero hosting bill
  • Confidence that the site can handle any load (it's static!)
  • The satisfaction of building everything myself

This is a personal project, and there's joy in the engineering. Yes, I'm a Head of Data, not a frontend developer. But technology is technology, and I enjoy understanding how things work under the hood.

What's Next

The site is live, deploys automatically, passes tests. Content appears when there's something to say. I don't set a goal to publish weekly — this is a personal blog, not a media outlet. But every article is made with the same attention to quality as the site itself.

The code lives on GitHub in a private repository. No secrets, no magic — just a well-thought-out stack that does its job.
