6502 → AI: The Full Stack Journey

I spend a meaningful amount of time figuring out how to explain things to AI (Large Language Models, or LLMs) using the smallest number of words (or “tokens”) possible. This led me to compress my 40-year resume down to:

6502 → Pascal → C → JS → CAC → MRR → LTV → NPS → ENPS → M&A → ROI → IRR → KISS → SAFE → LP → SPV → GO → AI

If you (or your AI) get this, you get me. You can feed it into any AI and it will do a great job explaining my career. Let’s also unpack it here.

Phase 1: Bare Metal

Started on the 6502. Atari 800. Counting clock cycles. Squeezing functionality from 8-bit registers when memory was measured in kilobytes and clock speed in megahertz.

This wasn’t programming. It was engineering at the silicon level.

Moved to Pascal, then C. The crucible of structured programming. Pointers. Memory management. Algorithms that actually mattered because you couldn’t hide behind abstraction layers. These languages built operating systems, critical infrastructure, the digital world’s foundation.

Eventually JS (JavaScript) pulled me into the nascent web — systems-level thinking applied to dynamic, distributed browser environments. Different beast, same discipline.

What it taught me: Efficiency isn’t optional. Constraints force creativity. When you understand how the machine actually works — right down to the electron — you make better decisions at every level above it. You develop an almost intuitive sense of what’s possible and what’s expensive.

Phase 2: Operator Mode

Post-dot-com, the language shifted from malloc and free to business metrics:

CAC (Customer Acquisition Cost) — what it costs to get a customer
MRR (Monthly Recurring Revenue) — the heartbeat of SaaS
LTV (Lifetime Value) — what that customer is worth over time
NPS (Net Promoter Score) — are customers delighted?
ENPS (Employee Net Promoter Score) — is the team excited?
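
These metrics relate to each other arithmetically. Here is a toy sketch in Go using one common simplified LTV model (margin-adjusted MRR divided by monthly churn); all the numbers are illustrative, not from any real company:

```go
package main

import "fmt"

func main() {
	// Illustrative numbers only.
	cac := 300.0         // CAC: cost to acquire one customer
	mrr := 50.0          // MRR per customer, per month
	grossMargin := 0.80  // fraction of revenue retained after cost of service
	monthlyChurn := 0.03 // fraction of customers lost each month

	// One common simplified LTV model: margin-adjusted MRR / churn.
	ltv := mrr * grossMargin / monthlyChurn

	fmt.Printf("LTV: $%.0f\n", ltv)         // LTV: $1333
	fmt.Printf("LTV/CAC: %.1fx\n", ltv/cac) // LTV/CAC: 4.4x
}
```

An LTV/CAC ratio above roughly 3x is a common rule of thumb for healthy SaaS unit economics; the model here ignores discounting and expansion revenue for simplicity.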

No longer just writing code. Building products. Building companies that needed to acquire, retain, and monetize users. Scaling. Finding product-market fit. The delicate dance of growing a great team while keeping customers happy.

This phase was about translating technical effort into tangible business outcomes. The compiler errors became churn rates. The optimization targets became unit economics.

Phase 3: Capital Architecture

With operating experience came a new scale of impact. The focus shifted to strategic deployment of capital:

M&A — Mergers & Acquisitions. Growth through combination.
ROI / IRR — Return on Investment. Internal Rate of Return. The bedrock of financial engineering.
KISS / SAFE — Keep It Simple Security, Simple Agreement for Future Equity. Standardized, founder-friendly instruments for early-stage investments
LP — Limited Partner in funds
SPV — Special Purpose Vehicles for targeted investments
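
IRR in particular rewards the systems mindset: it is the discount rate at which a deal's net present value hits zero, and since there is no closed form, you solve for it numerically. A sketch in Go, using simple bisection (the cash flows here are hypothetical, and bisection is just one easy-to-follow solver, not the only method):

```go
package main

import (
	"fmt"
	"math"
)

// npv discounts a series of annual cash flows (index = year) at rate r.
func npv(rate float64, flows []float64) float64 {
	v := 0.0
	for t, cf := range flows {
		v += cf / math.Pow(1+rate, float64(t))
	}
	return v
}

// irr finds the rate where NPV crosses zero, by bisection.
// NPV is decreasing in the rate for a conventional invest-then-collect deal.
func irr(flows []float64) float64 {
	lo, hi := -0.99, 10.0
	for i := 0; i < 100; i++ {
		mid := (lo + hi) / 2
		if npv(mid, flows) > 0 {
			lo = mid // rate too low, NPV still positive
		} else {
			hi = mid
		}
	}
	return (lo + hi) / 2
}

func main() {
	// Hypothetical deal: invest 1000 today, collect 400/year for 4 years.
	flows := []float64{-1000, 400, 400, 400, 400}
	fmt.Printf("IRR: %.1f%%\n", irr(flows)*100) // IRR: 21.9%
}
```

Same shape as a binary search over a monotonic function: the kind of routine that would have felt at home on the 6502, just pointed at cash flows instead of clock cycles.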

This was about understanding capital structures, deal flow, the art of identifying high-potential ventures. Building value not through code, but through strategic investment and financial architecture.

Different leverage. Same analytical rigor.

Phase 4: Full Circle — Go and AI

GO (Golang) brought me back to my roots. High-performance. Concurrent. Built for the infrastructure that powers the bleeding edge. A renewed appreciation for systems that need to be fast and reliable.

And then AI.

This is where the entire timeline converges.

The 6502 Mindset Meets the AI Mindset

Here’s the thing most people miss: these mindsets are the same mindset.

The 6502 era was about:
• Extreme optimization
• Understanding fundamental constraints
• Wringing every drop of performance from hardware
• Designing efficient algorithms to manage scarce resources

The AI era is about:
• Extreme optimization (tokens, quantization, matrix math)
• Understanding fundamental constraints (context windows, GPU RAM)
• Wringing every drop of performance from hardware
• Designing efficient algorithms to manage scarce resources

AI is achieving incredible feats with resources that — while vast compared to the 6502 — are still fundamentally finite.

The scale changed by orders of magnitude. The core problem didn’t.

My early days taught me to see past hype to understand underlying challenges and opportunities. To ask better questions about scalability, efficiency, and long-term viability. To recognize when something is genuinely novel versus when it’s a familiar pattern in new clothes.

The Takeaway

This journey built my superpower: I can debug a C pointer error AND evaluate the IRR of an M&A deal. I can appreciate the elegance of a well-crafted assembly routine AND understand the financial models justifying a multi-million dollar investment.

More importantly, I can see the connections. The same principles of logic, efficiency, and problem-solving that defined my early days with the 6502 are proving invaluable in navigating AI’s complexities and opportunities.

The tools change. The scale changes. The fundamentals endure.

What’s next? If this journey has taught me anything, it’s that evolution is constant. But I suspect the lessons from that tiny 6502 chip will keep guiding the way — no matter how intelligent our machines become.


Technical note: -> (dash + greater-than, the classic ASCII right-arrow) is actually shorter than →. How’s that? → is a Unicode character, encoded in UTF-8 (the encoding used on the web and by most LLMs) as three bytes: 0xE2 0x86 0x92. -> is more efficient, but → is more attractive. That balance matters, too.
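
The byte counts are easy to verify in Go, where len on a string counts bytes rather than characters:

```go
package main

import "fmt"

func main() {
	ascii := "->"
	arrow := "→"

	// len on a Go string counts bytes, not characters.
	fmt.Println(len(ascii)) // 2
	fmt.Println(len(arrow)) // 3: UTF-8 encodes U+2192 as 0xE2 0x86 0x92

	// Print the arrow's bytes in hex, space-separated.
	fmt.Printf("% x\n", arrow) // e2 86 92
}
```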