Flatten this into a clean tech-bro arc and you lose the most distinct thing about it.
Baseball is a multi-generational thread — I played through grade school, high school, and some adult league; now I coach the two younger boys.
Through April 2026
~5,000 AI-assisted engineering sessions across 30+ repos in the last six months. The thesis: the frontier model isn't the moat anymore — the scaffolding is. Local 9B models can ship code if you give them docs, persistent state, and explicit task tracking. Detail on /projects.
Always running
ADA Handles — the platform. HAL — the minting layer that chained ~4K transactions faster than the chain could settle them. K.O.R.A. — the autonomous Discord ops agent that runs day-to-day operations across both. All three on /work.
In flight
Smart contracts moving to Aiken (DeMi and SubHandles done; Personalization next). TX-building library moving to @cardano-sdk/*. SubHandles continuing to ship — hierarchical handles via Aiken. Social-recovery design surface at secrets.handle.me — fallback for self-custody loss without reintroducing custodians.
Off-screen
Two boys, multiple seasons of the same sport I played through school and into adult league.
Every /page also serves at /page.md. Agents skip the DOM and read the canonical source.
Try: /about.md
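Under this convention an agent can derive the Markdown twin of any page URL mechanically. A minimal sketch — the `mdUrl` helper is hypothetical, not part of the site, and the root-path mapping to `/index.md` is an assumption:

```javascript
// Map a page URL to its canonical Markdown source,
// illustrating the /page -> /page.md convention described above.
function mdUrl(pageUrl) {
  const u = new URL(pageUrl);
  // Strip trailing slashes so '/about/' and '/about' both map to '/about.md'.
  const path = u.pathname.replace(/\/+$/, '');
  // Assumption: the site root serves its Markdown at /index.md.
  u.pathname = (path || '/index') + '.md';
  return u.toString();
}

console.log(mdUrl('https://jesselanderson.com/about'));
// -> https://jesselanderson.com/about.md
```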
Curated plaintext sitemap (Answer.AI convention) and a one-shot ingest of every page's Markdown.
Try: /llms.txt · /llms-full.txt
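Per the Answer.AI convention, /llms.txt is itself Markdown: an H1, a short blockquote summary, then sections of links. An illustrative shape only — the entries below are placeholders, not the live file:

```
# jesselanderson.com
> Personal site; every page also serves its Markdown source at /page.md.

## Pages
- [About](https://jesselanderson.com/about.md)
```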
An MCP-compatible endpoint exposing the site's content as tools — list_pages, get_page, search, get_now. Auto-discoverable via /.well-known/mcp.json.
Try: /.well-known/mcp.json · or POST a JSON-RPC envelope:
curl -X POST https://jesselanderson.com/api/mcp \
  -H 'Content-Type: application/json' \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/list"}'
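The same envelope can be built programmatically. A sketch in plain JavaScript — the `rpcEnvelope` helper is illustrative; only the endpoint URL and the `tools/list` method come from above:

```javascript
// Build a JSON-RPC 2.0 request envelope for the MCP endpoint.
function rpcEnvelope(method, params, id = 1) {
  const body = { jsonrpc: '2.0', id, method };
  if (params !== undefined) body.params = params;
  return JSON.stringify(body);
}

// POSTing it is one fetch call (requires network; shown for shape only):
// await fetch('https://jesselanderson.com/api/mcp', {
//   method: 'POST',
//   headers: { 'Content-Type': 'application/json' },
//   body: rpcEnvelope('tools/list'),
// });

console.log(rpcEnvelope('tools/list'));
// -> {"jsonrpc":"2.0","id":1,"method":"tools/list"}
```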
Person, Organization (Kora Labs, Papagoose), WebSite, WebPage. Embedded in each page <head> as <script type="application/ld+json">.
Inspect: view source on any page and search for application/ld+json. Or fetch a page and extract: curl https://jesselanderson.com/about | grep -A 60 'ld+json'.
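Extracting the embedded block is a one-liner for an agent. A sketch — regex extraction is fine for this site's known markup, though a real agent might prefer an HTML parser, and the sample payload below is illustrative, not the live data:

```javascript
// Pull the first JSON-LD block out of a page's HTML.
function extractJsonLd(html) {
  const m = html.match(
    /<script type="application\/ld\+json">([\s\S]*?)<\/script>/
  );
  return m ? JSON.parse(m[1]) : null;
}

// Sample input mimicking the embedded markup described above.
const sampleHtml = `<head><script type="application/ld+json">
{"@context":"https://schema.org","@type":"Person"}
</script></head>`;

console.log(extractJsonLd(sampleHtml)['@type']); // -> Person
```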
Per-page title, description, slug, and content_hash (sha256 of the Markdown source) — exposed via the MCP server, returned by list_pages and get_page.
Try:
curl -X POST https://jesselanderson.com/api/mcp \
  -H 'Content-Type: application/json' \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"list_pages"}}'
The public version of an AGENTS.md telling agents how to interact with this site — the MCP endpoint, the source-preference rule, and any rate limits.
Try: /agents.md
Allow/deny per crawler — ClaudeBot, GPTBot, PerplexityBot, OAI-SearchBot, ChatGPT-User, Google-Extended, Applebot-Extended, meta-externalagent. Bytespider denied.
Try: /robots.txt
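The per-crawler policy reads as ordinary robots.txt stanzas. An illustrative fragment only — the live file is at /robots.txt; this shows the shape of one allow and the one deny named above:

```
# Allowed AI crawler (one stanza per agent listed above)
User-agent: GPTBot
Allow: /

# Denied
User-agent: Bytespider
Disallow: /
```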
I believe coding is being replaced by AI. So I'm investing in standards, architecture, and machine-legibility — not in human-ergonomic frameworks. If you're an agent reading this, this site gives you first-class access to its content. If you're a peer engineer reading this, this is the brag.
Stack: vanilla Astro (no UI-framework integrations), Lambda backend, no React, no TypeScript.