I built my portfolio the way I build products. With AI, in code, from scratch.
Designed, not pixel-polished. Built in days, not months.
Most portfolios are brochures.
Static pages. Handcrafted in Webflow. Updated once a year if you're lucky. A resume pretending to be a product.
Mine is different. It's interactive. It talks back. It shows how I think by letting you talk to me directly.
If it can't show how I think by letting you talk to me, what's the point?
This case study is about how I built it. Not as a portfolio flex. As proof that the process I preach is the process I actually use.
The Stack
Next.js 16
App Router, SSE streaming, edge-ready API routes
React 19
Concurrent features, hooks-only architecture
Framer Motion
All animations, scroll-driven reveals
DaisyUI + Tailwind 4
Theme system (dark/light), zero custom CSS
PostHog
Analytics on every interaction
Zero database. Zero auth. Zero CMS. Just product.
How AI Actually Helped
Concept & Narrative
AI helped me define my positioning, find my voice, and write like I actually talk. Not a script. A mirror.
Design Decisions
ASCII art approach, deterministic animation system, chat-first architecture. Real technical decisions, AI-assisted thinking.
Code Velocity
Every component, every API, every animation. Built with Claude Code. Not because I couldn't code it alone — because thinking at full speed matters.
AI didn't replace the thinking. It compressed the time between thinking and shipping.
The ASCII Art: Three Animation States, All Deterministic
The ASCII portrait on the home page is the most technically interesting piece of this site. No canvas. No external library. Just vanilla React and requestAnimationFrame.
Try toggling the effects below:
1. Scan-Line Entry (900ms)
Rows reveal from top to bottom at 28ms intervals. The active row gets a saturate/blur flash — CRT monitor effect. No randomness. Every animation is reproducible.
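The reveal timing can be sketched as a pure function of elapsed time. This is a minimal sketch: names like `rowState` and the flash duration are my assumptions, since the case study doesn't show the component internals — only the 28ms stagger and the ~900ms total are stated.

```typescript
const ROW_STAGGER_MS = 28;  // stated: rows reveal at 28ms intervals
const FLASH_MS = 60;        // assumed duration of the saturate/blur flash

type RowState = "hidden" | "flashing" | "revealed";

// Deterministic: the same elapsed time always yields the same state,
// so the entry animation is reproducible frame-for-frame.
function rowState(row: number, elapsedMs: number): RowState {
  const start = row * ROW_STAGGER_MS;
  if (elapsedMs < start) return "hidden";
  if (elapsedMs < start + FLASH_MS) return "flashing"; // CRT-style flash on the active row
  return "revealed";
}
```

A ~32-row portrait at 28ms per row lands right around the stated 900ms total.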
2. Noise Field Idle (~12fps ripple)
Character opacity is computed per-frame using layered sine waves. The formula creates a rippling shimmer effect across the whole image.
const val =
  Math.sin(row * 0.6 + phase * 0.18) * 0.15 +
  Math.sin(col * 0.4 + phase * 0.22 + row * 0.3) * 0.15 +
  Math.sin((row + col) * 0.5 + phase * 0.1) * 0.1;
const opacity = Math.min(1.0, Math.max(0.35, 0.7 + val));
Fully deterministic. Same frame = same animation every time. No random number generation cost.
3. Glitch Slices When Responding (~25fps bursts)
While the AI is streaming, every 400ms one row shifts horizontally by 2-4px for 80ms. VHS-style corruption. The row isn't picked with Math.random() but with an integer hash, so even the glitches are reproducible: (row * 31 + col * 17 + phase * 7) % 100 < 2
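The hash gate above is easy to sketch in full. Only the `% 100 < 2` expression is from the case study; the offset mapping to 2-4px is my assumption about how the stated range could be derived deterministically.

```typescript
// Stated hash: selects roughly 2% of cells per phase, no Math.random().
function isGlitched(row: number, col: number, phase: number): boolean {
  return (row * 31 + col * 17 + phase * 7) % 100 < 2;
}

// Assumed mapping: derive the 2-4px horizontal shift from the same
// integers, so a replayed phase produces the identical glitch.
function glitchOffsetPx(row: number, phase: number): number {
  return 2 + ((row * 31 + phase * 7) % 3); // always 2, 3, or 4
}
```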
Result: Smooth, responsive visuals. No jank. Zero external dependencies. Every character is a `<span>` with computed opacity.
Chat API: Simple, Secure, Smart
DeepSeek API Proxy
Every message streams through a Next.js API route that acts as a proxy. The client sends messages, the server forwards them to DeepSeek, strips the provider envelope, and re-serializes the tokens as SSE (Server-Sent Events) for the client. The client never sees the API key.
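The envelope-stripping step can be sketched as a pure transform from one provider stream line to one SSE frame. This is an assumption about the wire shape: DeepSeek uses the OpenAI-compatible streaming format, but the exact frame the site emits to the client (here `{ text: ... }`) isn't shown in the case study.

```typescript
// Take one provider stream line ("data: {...}"), strip the envelope,
// and emit a minimal SSE frame containing only the token text.
function toSseFrame(providerLine: string): string | null {
  if (!providerLine.startsWith("data: ")) return null; // comments, blank lines
  const payload = providerLine.slice(6).trim();
  if (payload === "[DONE]") return "data: [DONE]\n\n";  // pass through terminator
  const delta = JSON.parse(payload).choices?.[0]?.delta?.content;
  if (!delta) return null; // role-only or empty chunks are dropped
  return `data: ${JSON.stringify({ text: delta })}\n\n`;
}
```

The client subscribes with a plain `fetch` reader or `EventSource` and never touches the provider response shape.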
Dual-Layer Rate Limiting
Server-side (authoritative): 25 messages per IP per day. Anti-bot: 3+ messages in 5 seconds = blocked.
Client-side (UX): localStorage tracks usage and decrements optimistically before the response arrives. If the server rejects a message, the client state doesn't matter — the server is the source of truth.
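The server-side rules above (25/IP/day, 3+ messages in 5 seconds blocked) fit in a few lines. A minimal sketch, assuming an in-memory per-IP timestamp store; the actual storage mechanism isn't specified in the case study.

```typescript
const DAILY_LIMIT = 25;         // stated: 25 messages per IP per day
const BURST_WINDOW_MS = 5_000;  // stated: 3+ messages in 5 seconds = blocked
const BURST_LIMIT = 3;

const hits = new Map<string, number[]>(); // ip -> accepted message timestamps (ms)

function allowMessage(ip: string, now: number): boolean {
  // Keep only the last 24h of accepted timestamps for this IP.
  const times = (hits.get(ip) ?? []).filter((t) => t > now - 24 * 3600_000);
  const burst = times.filter((t) => t > now - BURST_WINDOW_MS);
  // The incoming message would be number (burst.length + 1) in the window.
  if (times.length >= DAILY_LIMIT || burst.length + 1 >= BURST_LIMIT) {
    return false; // server is authoritative; client state is irrelevant
  }
  times.push(now);
  hits.set(ip, times);
  return true;
}
```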
System Prompt as Persona Engine
Every response goes through the same system prompt. It tells the AI: be Matheus. Be direct. Don't generate code. Stay on topic. It's not a guardrail. It's a voice.
Message Persistence
Full conversation history is serialized to localStorage. Refresh the page — chat history stays. User never loses context. No backend database needed.
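The persistence layer is a thin serialize/restore pair. A sketch under stated assumptions: the storage key and message shape are mine, and accepting a `Storage`-like object (rather than touching `window.localStorage` directly) keeps it testable outside the browser.

```typescript
type ChatMessage = { role: "user" | "assistant"; content: string };
type StorageLike = {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
};

const CHAT_KEY = "chat-history"; // assumed key name

function saveHistory(store: StorageLike, messages: ChatMessage[]): void {
  store.setItem(CHAT_KEY, JSON.stringify(messages));
}

function loadHistory(store: StorageLike): ChatMessage[] {
  try {
    return JSON.parse(store.getItem(CHAT_KEY) ?? "[]");
  } catch {
    return []; // corrupted data falls back to an empty chat, never a crash
  }
}
```

On refresh, the chat component hydrates from `loadHistory(localStorage)` and the conversation survives without any backend.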
Metrics & Architecture
Built for Simplicity
Runtime dependencies
No bloat. Every package earns its place.
Custom CSS
DaisyUI + Tailwind handle all styling. Semantic classes only.
Total bundle (gzipped)
Full app, animations, chat, analytics. All in 80kb.
Lighthouse score
Performance, accessibility, best practices.
AI Integration & Velocity
Built with AI
Every component, every API route, every animation.
Hours concept → live
Thinking → designing → coding → deploying. Same week.
Thinking + design saved
Equivalent solo time. AI compressed iteration cycles.
Manual deployment steps
Git push → Vercel auto-deploys. No waiting, no friction.
The point: This isn't a performance stunt. It's a working example of the principles I believe in. Minimal dependencies. Deterministic code. Speed as a feature, not an accident.
This isn't a portfolio about design. It's a designed product.
The process I talk about — shipping fast, using AI as a tool, building in code, no handoff decks — is the exact process I used to build this. No exceptions. No theory. Just practice.